00:00:00.000 Started by upstream project "autotest-nightly-lts" build number 2410 00:00:00.000 originally caused by: 00:00:00.000 Started by upstream project "nightly-trigger" build number 3671 00:00:00.000 originally caused by: 00:00:00.000 Started by timer 00:00:00.039 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.043 The recommended git tool is: git 00:00:00.044 using credential 00000000-0000-0000-0000-000000000002 00:00:00.045 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.054 Fetching changes from the remote Git repository 00:00:00.055 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.065 Using shallow fetch with depth 1 00:00:00.065 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.065 > git --version # timeout=10 00:00:00.076 > git --version # 'git version 2.39.2' 00:00:00.076 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.091 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.091 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:06.791 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:06.807 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:06.820 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD) 00:00:06.820 > git config core.sparsecheckout # timeout=10 00:00:06.833 > git read-tree -mu HEAD # timeout=10 00:00:06.850 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5 00:00:06.871 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag" 00:00:06.871 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10 00:00:06.953 [Pipeline] Start of Pipeline 00:00:06.969 [Pipeline] library 00:00:06.971 Loading library shm_lib@master 00:00:06.971 Library shm_lib@master is cached. Copying from home. 00:00:06.987 [Pipeline] node 00:00:07.004 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:07.006 [Pipeline] { 00:00:07.018 [Pipeline] catchError 00:00:07.020 [Pipeline] { 00:00:07.037 [Pipeline] wrap 00:00:07.050 [Pipeline] { 00:00:07.061 [Pipeline] stage 00:00:07.064 [Pipeline] { (Prologue) 00:00:07.381 [Pipeline] sh 00:00:07.669 + logger -p user.info -t JENKINS-CI 00:00:07.685 [Pipeline] echo 00:00:07.687 Node: WFP20 00:00:07.693 [Pipeline] sh 00:00:08.017 [Pipeline] setCustomBuildProperty 00:00:08.031 [Pipeline] echo 00:00:08.032 Cleanup processes 00:00:08.038 [Pipeline] sh 00:00:08.324 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:08.324 4103986 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:08.338 [Pipeline] sh 00:00:08.626 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:08.626 ++ grep -v 'sudo pgrep' 00:00:08.626 ++ awk '{print $1}' 00:00:08.626 + sudo kill -9 00:00:08.626 + true 00:00:08.641 [Pipeline] cleanWs 00:00:08.650 [WS-CLEANUP] Deleting project workspace... 00:00:08.651 [WS-CLEANUP] Deferred wipeout is used... 
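The process sweep just above is a pattern worth noting: pgrep -af lists everything still running under the workspace, grep -v drops the pgrep invocation itself, awk keeps the PID column, and the trailing "+ true" keeps an empty match from failing the stage. A minimal sketch of the same pattern in bash (the workspace path is illustrative):

    # Sweep stale test processes before reusing the workspace (sketch).
    ws=/var/jenkins/workspace/short-fuzz-phy-autotest   # illustrative path
    pids=$(sudo pgrep -af "$ws/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
    [ -z "$pids" ] || sudo kill -9 $pids || true        # tolerate no matches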
00:00:08.657 [WS-CLEANUP] done 00:00:08.661 [Pipeline] setCustomBuildProperty 00:00:08.675 [Pipeline] sh 00:00:08.961 + sudo git config --global --replace-all safe.directory '*' 00:00:09.057 [Pipeline] httpRequest 00:00:09.679 [Pipeline] echo 00:00:09.681 Sorcerer 10.211.164.20 is alive 00:00:09.690 [Pipeline] retry 00:00:09.692 [Pipeline] { 00:00:09.707 [Pipeline] httpRequest 00:00:09.712 HttpMethod: GET 00:00:09.712 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:09.713 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:09.726 Response Code: HTTP/1.1 200 OK 00:00:09.726 Success: Status code 200 is in the accepted range: 200,404 00:00:09.729 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:14.243 [Pipeline] } 00:00:14.261 [Pipeline] // retry 00:00:14.268 [Pipeline] sh 00:00:14.555 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:14.569 [Pipeline] httpRequest 00:00:15.222 [Pipeline] echo 00:00:15.224 Sorcerer 10.211.164.20 is alive 00:00:15.231 [Pipeline] retry 00:00:15.233 [Pipeline] { 00:00:15.244 [Pipeline] httpRequest 00:00:15.248 HttpMethod: GET 00:00:15.249 URL: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:15.249 Sending request to url: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:15.271 Response Code: HTTP/1.1 200 OK 00:00:15.271 Success: Status code 200 is in the accepted range: 200,404 00:00:15.272 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:44.561 [Pipeline] } 00:00:44.578 [Pipeline] // retry 00:00:44.586 [Pipeline] sh 00:00:44.875 + tar --no-same-owner -xf spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:47.432 [Pipeline] sh 00:00:47.728 + git -C spdk log --oneline -n5 00:00:47.728 c13c99a5e test: Various fixes for Fedora40 00:00:47.728 726a04d70 test/nvmf: adjust timeout for bigger nvmes 00:00:47.728 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11 00:00:47.728 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched 00:00:47.728 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges 00:00:47.766 [Pipeline] } 00:00:47.781 [Pipeline] // stage 00:00:47.790 [Pipeline] stage 00:00:47.793 [Pipeline] { (Prepare) 00:00:47.810 [Pipeline] writeFile 00:00:47.826 [Pipeline] sh 00:00:48.113 + logger -p user.info -t JENKINS-CI 00:00:48.126 [Pipeline] sh 00:00:48.412 + logger -p user.info -t JENKINS-CI 00:00:48.426 [Pipeline] sh 00:00:48.709 + cat autorun-spdk.conf 00:00:48.709 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:48.709 SPDK_TEST_FUZZER_SHORT=1 00:00:48.709 SPDK_TEST_FUZZER=1 00:00:48.709 SPDK_RUN_UBSAN=1 00:00:48.717 RUN_NIGHTLY=1 00:00:48.723 [Pipeline] readFile 00:00:48.757 [Pipeline] withEnv 00:00:48.760 [Pipeline] { 00:00:48.774 [Pipeline] sh 00:00:49.058 + set -ex 00:00:49.058 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:00:49.058 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:00:49.058 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:49.058 ++ SPDK_TEST_FUZZER_SHORT=1 00:00:49.058 ++ SPDK_TEST_FUZZER=1 00:00:49.058 ++ SPDK_RUN_UBSAN=1 00:00:49.058 ++ RUN_NIGHTLY=1 00:00:49.058 + case $SPDK_TEST_NVMF_NICS in 00:00:49.058 + DRIVERS= 00:00:49.058 + [[ -n '' ]] 00:00:49.058 + exit 0 00:00:49.068 
[Pipeline] } 00:00:49.086 [Pipeline] // withEnv 00:00:49.092 [Pipeline] } 00:00:49.108 [Pipeline] // stage 00:00:49.119 [Pipeline] catchError 00:00:49.121 [Pipeline] { 00:00:49.138 [Pipeline] timeout 00:00:49.138 Timeout set to expire in 30 min 00:00:49.141 [Pipeline] { 00:00:49.158 [Pipeline] stage 00:00:49.161 [Pipeline] { (Tests) 00:00:49.176 [Pipeline] sh 00:00:49.459 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:49.459 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:49.459 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:00:49.459 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:00:49.459 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:49.459 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:00:49.459 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:00:49.459 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:00:49.459 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:00:49.459 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:00:49.459 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:00:49.459 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:49.459 + source /etc/os-release 00:00:49.459 ++ NAME='Fedora Linux' 00:00:49.459 ++ VERSION='39 (Cloud Edition)' 00:00:49.459 ++ ID=fedora 00:00:49.459 ++ VERSION_ID=39 00:00:49.459 ++ VERSION_CODENAME= 00:00:49.459 ++ PLATFORM_ID=platform:f39 00:00:49.459 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:00:49.459 ++ ANSI_COLOR='0;38;2;60;110;180' 00:00:49.459 ++ LOGO=fedora-logo-icon 00:00:49.459 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:00:49.459 ++ HOME_URL=https://fedoraproject.org/ 00:00:49.459 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:00:49.459 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:00:49.459 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:00:49.459 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:00:49.459 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:00:49.459 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:00:49.459 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:00:49.459 ++ SUPPORT_END=2024-11-12 00:00:49.459 ++ VARIANT='Cloud Edition' 00:00:49.460 ++ VARIANT_ID=cloud 00:00:49.460 + uname -a 00:00:49.460 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:00:49.460 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:00:51.994 Hugepages 00:00:51.994 node hugesize free / total 00:00:51.994 node0 1048576kB 0 / 0 00:00:51.994 node0 2048kB 0 / 0 00:00:52.254 node1 1048576kB 0 / 0 00:00:52.254 node1 2048kB 0 / 0 00:00:52.254 00:00:52.254 Type BDF Vendor Device NUMA Driver Device Block devices 00:00:52.254 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:00:52.254 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:00:52.254 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:00:52.254 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:00:52.254 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:00:52.254 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:00:52.254 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:00:52.254 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:00:52.254 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:00:52.254 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:00:52.254 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:00:52.254 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:00:52.254 I/OAT 0000:80:04.4 8086 
2021 1 ioatdma - - 00:00:52.254 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:00:52.254 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:00:52.254 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:00:52.254 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:00:52.254 + rm -f /tmp/spdk-ld-path 00:00:52.254 + source autorun-spdk.conf 00:00:52.254 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:52.254 ++ SPDK_TEST_FUZZER_SHORT=1 00:00:52.254 ++ SPDK_TEST_FUZZER=1 00:00:52.254 ++ SPDK_RUN_UBSAN=1 00:00:52.254 ++ RUN_NIGHTLY=1 00:00:52.254 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:00:52.254 + [[ -n '' ]] 00:00:52.254 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:52.254 + for M in /var/spdk/build-*-manifest.txt 00:00:52.254 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:00:52.254 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:52.254 + for M in /var/spdk/build-*-manifest.txt 00:00:52.254 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:00:52.254 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:52.254 + for M in /var/spdk/build-*-manifest.txt 00:00:52.254 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:00:52.254 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:52.254 ++ uname 00:00:52.254 + [[ Linux == \L\i\n\u\x ]] 00:00:52.254 + sudo dmesg -T 00:00:52.513 + sudo dmesg --clear 00:00:52.513 + dmesg_pid=4104895 00:00:52.513 + [[ Fedora Linux == FreeBSD ]] 00:00:52.513 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:52.513 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:52.513 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:00:52.513 + [[ -x /usr/src/fio-static/fio ]] 00:00:52.513 + export FIO_BIN=/usr/src/fio-static/fio 00:00:52.513 + FIO_BIN=/usr/src/fio-static/fio 00:00:52.513 + sudo dmesg -Tw 00:00:52.513 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:00:52.513 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:00:52.513 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:00:52.513 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:52.513 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:52.513 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:00:52.513 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:52.513 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:52.513 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:00:52.513 Test configuration: 00:00:52.513 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:52.513 SPDK_TEST_FUZZER_SHORT=1 00:00:52.513 SPDK_TEST_FUZZER=1 00:00:52.513 SPDK_RUN_UBSAN=1 00:00:52.513 RUN_NIGHTLY=1 06:10:21 -- common/autotest_common.sh@1689 -- $ [[ n == y ]] 00:00:52.513 06:10:21 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:00:52.513 06:10:21 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:00:52.514 06:10:21 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:00:52.514 06:10:21 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:00:52.514 06:10:21 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:52.514 06:10:21 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:52.514 06:10:21 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:52.514 06:10:21 -- paths/export.sh@5 -- $ export PATH 00:00:52.514 06:10:21 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:52.514 06:10:21 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:00:52.514 06:10:21 -- common/autobuild_common.sh@440 -- $ date +%s 00:00:52.514 06:10:21 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1732684221.XXXXXX 00:00:52.514 06:10:21 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1732684221.Gmo4Ct 00:00:52.514 06:10:21 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]] 00:00:52.514 06:10:21 -- common/autobuild_common.sh@446 -- $ '[' -n '' ']' 00:00:52.514 06:10:21 
-- common/autobuild_common.sh@449 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/' 00:00:52.514 06:10:21 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:00:52.514 06:10:21 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:00:52.514 06:10:21 -- common/autobuild_common.sh@456 -- $ get_config_params 00:00:52.514 06:10:21 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:00:52.514 06:10:21 -- common/autotest_common.sh@10 -- $ set +x 00:00:52.514 06:10:21 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user' 00:00:52.514 06:10:21 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:00:52.514 06:10:21 -- spdk/autobuild.sh@12 -- $ umask 022 00:00:52.514 06:10:21 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:52.514 06:10:21 -- spdk/autobuild.sh@16 -- $ date -u 00:00:52.514 Wed Nov 27 05:10:21 AM UTC 2024 00:00:52.514 06:10:21 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:00:52.514 LTS-67-gc13c99a5e 00:00:52.514 06:10:22 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:00:52.514 06:10:22 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:00:52.514 06:10:22 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:00:52.514 06:10:22 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:00:52.514 06:10:22 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:00:52.514 06:10:22 -- common/autotest_common.sh@10 -- $ set +x 00:00:52.514 ************************************ 00:00:52.514 START TEST ubsan 00:00:52.514 ************************************ 00:00:52.514 06:10:22 -- common/autotest_common.sh@1114 -- $ echo 'using ubsan' 00:00:52.514 using ubsan 00:00:52.514 00:00:52.514 real 0m0.000s 00:00:52.514 user 0m0.000s 00:00:52.514 sys 0m0.000s 00:00:52.514 06:10:22 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:00:52.514 06:10:22 -- common/autotest_common.sh@10 -- $ set +x 00:00:52.514 ************************************ 00:00:52.514 END TEST ubsan 00:00:52.514 ************************************ 00:00:52.514 06:10:22 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:00:52.514 06:10:22 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:00:52.514 06:10:22 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:00:52.514 06:10:22 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:00:52.514 06:10:22 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:00:52.514 06:10:22 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:00:52.514 06:10:22 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:00:52.514 06:10:22 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:00:52.514 06:10:22 -- common/autotest_common.sh@10 -- $ set +x 00:00:52.773 ************************************ 00:00:52.773 START TEST autobuild_llvm_precompile 00:00:52.773 ************************************ 00:00:52.773 06:10:22 -- common/autotest_common.sh@1114 -- $ _llvm_precompile 00:00:52.773 06:10:22 -- common/autobuild_common.sh@32 -- $ clang 
--version 00:00:52.773 06:10:22 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:00:52.773 Target: x86_64-redhat-linux-gnu 00:00:52.773 Thread model: posix 00:00:52.773 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:00:52.773 06:10:22 -- common/autobuild_common.sh@33 -- $ clang_num=17 00:00:52.773 06:10:22 -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:00:52.773 06:10:22 -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:00:52.773 06:10:22 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:00:52.773 06:10:22 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:00:52.773 06:10:22 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:00:52.773 06:10:22 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:00:52.773 06:10:22 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:00:52.773 06:10:22 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:00:52.773 06:10:22 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:00:53.033 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:00:53.033 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:00:53.293 Using 'verbs' RDMA provider 00:01:08.751 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:01:20.962 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:01:21.531 Creating mk/config.mk...done. 00:01:21.531 Creating mk/cc.flags.mk...done. 00:01:21.531 Type 'make' to build. 
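Before the configure call above, the precompile step works out which clang fuzzer runtime to link: it captures the version from clang --version with a bash regex, then expands an extglob pattern over /usr/lib*/clang. A sketch of that resolution, assuming the extglob and nullglob shell options are enabled (the pattern requires extglob):

    # Resolve libclang_rt.fuzzer_no_main.a for --with-fuzzer (sketch).
    shopt -s extglob nullglob
    [[ $(clang --version) =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]]
    clang_version=${BASH_REMATCH[1]}   # e.g. 17.0.6
    clang_num=${BASH_REMATCH[2]}       # e.g. 17
    fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
    echo "--with-fuzzer=${fuzzer_libs[0]}"

The first match is what lands in the --with-fuzzer path on the configure line above.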
00:01:21.531 00:01:21.531 real 0m28.809s 00:01:21.531 user 0m12.573s 00:01:21.531 sys 0m15.651s 00:01:21.531 06:10:50 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:01:21.531 06:10:50 -- common/autotest_common.sh@10 -- $ set +x 00:01:21.531 ************************************ 00:01:21.531 END TEST autobuild_llvm_precompile 00:01:21.531 ************************************ 00:01:21.531 06:10:50 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:01:21.531 06:10:50 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:01:21.531 06:10:50 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:01:21.531 06:10:50 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:01:21.531 06:10:50 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:21.790 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:01:21.790 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:01:22.049 Using 'verbs' RDMA provider 00:01:35.198 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:01:47.408 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:01:47.408 Creating mk/config.mk...done. 00:01:47.408 Creating mk/cc.flags.mk...done. 00:01:47.408 Type 'make' to build. 00:01:47.408 06:11:15 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:01:47.408 06:11:15 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:01:47.408 06:11:15 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:47.408 06:11:15 -- common/autotest_common.sh@10 -- $ set +x 00:01:47.408 ************************************ 00:01:47.408 START TEST make 00:01:47.408 ************************************ 00:01:47.408 06:11:15 -- common/autotest_common.sh@1114 -- $ make -j112 00:01:47.408 make[1]: Nothing to be done for 'all'. 
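The build itself is launched as run_test make make -j112, with the parallel width fixed to this runner's core count; a host-agnostic equivalent derives it instead:

    # Parallel build sized to the host rather than a hard-coded -j112 (sketch).
    make -j"$(nproc)"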
00:01:47.975 The Meson build system 00:01:47.975 Version: 1.5.0 00:01:47.975 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:01:47.975 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:47.975 Build type: native build 00:01:47.975 Project name: libvfio-user 00:01:47.975 Project version: 0.0.1 00:01:47.975 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:01:47.975 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:01:47.975 Host machine cpu family: x86_64 00:01:47.975 Host machine cpu: x86_64 00:01:47.975 Run-time dependency threads found: YES 00:01:47.975 Library dl found: YES 00:01:47.975 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:47.975 Run-time dependency json-c found: YES 0.17 00:01:47.975 Run-time dependency cmocka found: YES 1.1.7 00:01:47.975 Program pytest-3 found: NO 00:01:47.975 Program flake8 found: NO 00:01:47.975 Program misspell-fixer found: NO 00:01:47.975 Program restructuredtext-lint found: NO 00:01:47.975 Program valgrind found: YES (/usr/bin/valgrind) 00:01:47.975 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:47.975 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:47.975 Compiler for C supports arguments -Wwrite-strings: YES 00:01:47.975 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:01:47.975 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:01:47.975 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:01:47.975 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
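The libvfio-user configure above can be reproduced by hand with plain Meson; judging from the build-directory name and the user-defined options echoed a little further on (buildtype debug, static default_library, libdir /usr/local/lib), a roughly equivalent invocation would be:

    # Configure and build libvfio-user standalone (sketch; SPDK's build
    # scripts normally drive this step).
    meson setup --buildtype=debug --default-library=static \
      --libdir=/usr/local/lib \
      build/libvfio-user/build-debug libvfio-user
    ninja -C build/libvfio-user/build-debug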
00:01:47.975 Build targets in project: 8 00:01:47.975 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:01:47.975 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:01:47.975 00:01:47.975 libvfio-user 0.0.1 00:01:47.975 00:01:47.975 User defined options 00:01:47.975 buildtype : debug 00:01:47.975 default_library: static 00:01:47.975 libdir : /usr/local/lib 00:01:47.975 00:01:47.975 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:48.544 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:01:48.544 [1/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:01:48.544 [2/36] Compiling C object samples/lspci.p/lspci.c.o 00:01:48.544 [3/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:01:48.544 [4/36] Compiling C object samples/null.p/null.c.o 00:01:48.544 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:01:48.544 [6/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:01:48.544 [7/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:01:48.544 [8/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:01:48.544 [9/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:01:48.544 [10/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:01:48.544 [11/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:01:48.544 [12/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:01:48.544 [13/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:01:48.544 [14/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:01:48.544 [15/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:01:48.544 [16/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:01:48.544 [17/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:01:48.544 [18/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:01:48.544 [19/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:01:48.544 [20/36] Compiling C object test/unit_tests.p/mocks.c.o 00:01:48.544 [21/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:01:48.544 [22/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:01:48.544 [23/36] Compiling C object samples/server.p/server.c.o 00:01:48.544 [24/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:01:48.544 [25/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:01:48.544 [26/36] Compiling C object samples/client.p/client.c.o 00:01:48.544 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:01:48.544 [28/36] Linking static target lib/libvfio-user.a 00:01:48.544 [29/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:01:48.544 [30/36] Linking target samples/client 00:01:48.544 [31/36] Linking target samples/gpio-pci-idio-16 00:01:48.544 [32/36] Linking target samples/shadow_ioeventfd_server 00:01:48.544 [33/36] Linking target samples/null 00:01:48.544 [34/36] Linking target samples/server 00:01:48.544 [35/36] Linking target samples/lspci 00:01:48.544 [36/36] Linking target test/unit_tests 00:01:48.544 INFO: autodetecting backend as ninja 00:01:48.544 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:48.544 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:49.132 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:01:49.132 ninja: no work to do. 00:01:54.412 The Meson build system 00:01:54.412 Version: 1.5.0 00:01:54.412 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk 00:01:54.412 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp 00:01:54.412 Build type: native build 00:01:54.412 Program cat found: YES (/usr/bin/cat) 00:01:54.412 Project name: DPDK 00:01:54.412 Project version: 23.11.0 00:01:54.412 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:01:54.412 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:01:54.412 Host machine cpu family: x86_64 00:01:54.412 Host machine cpu: x86_64 00:01:54.412 Message: ## Building in Developer Mode ## 00:01:54.412 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:54.412 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:01:54.412 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:01:54.412 Program python3 found: YES (/usr/bin/python3) 00:01:54.412 Program cat found: YES (/usr/bin/cat) 00:01:54.412 Compiler for C supports arguments -march=native: YES 00:01:54.412 Checking for size of "void *" : 8 00:01:54.412 Checking for size of "void *" : 8 (cached) 00:01:54.412 Library m found: YES 00:01:54.412 Library numa found: YES 00:01:54.412 Has header "numaif.h" : YES 00:01:54.412 Library fdt found: NO 00:01:54.412 Library execinfo found: NO 00:01:54.412 Has header "execinfo.h" : YES 00:01:54.412 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:54.412 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:54.412 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:54.412 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:54.412 Run-time dependency openssl found: YES 3.1.1 00:01:54.412 Run-time dependency libpcap found: YES 1.10.4 00:01:54.412 Has header "pcap.h" with dependency libpcap: YES 00:01:54.412 Compiler for C supports arguments -Wcast-qual: YES 00:01:54.412 Compiler for C supports arguments -Wdeprecated: YES 00:01:54.412 Compiler for C supports arguments -Wformat: YES 00:01:54.412 Compiler for C supports arguments -Wformat-nonliteral: YES 00:01:54.412 Compiler for C supports arguments -Wformat-security: YES 00:01:54.412 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:54.412 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:54.412 Compiler for C supports arguments -Wnested-externs: YES 00:01:54.412 Compiler for C supports arguments -Wold-style-definition: YES 00:01:54.412 Compiler for C supports arguments -Wpointer-arith: YES 00:01:54.412 Compiler for C supports arguments -Wsign-compare: YES 00:01:54.412 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:54.412 Compiler for C supports arguments -Wundef: YES 00:01:54.412 Compiler for C supports arguments -Wwrite-strings: YES 00:01:54.412 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:54.412 Compiler for C supports arguments -Wno-packed-not-aligned: NO 00:01:54.412 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:54.412 Program objdump found: YES (/usr/bin/objdump) 00:01:54.412 
Compiler for C supports arguments -mavx512f: YES 00:01:54.412 Checking if "AVX512 checking" compiles: YES 00:01:54.412 Fetching value of define "__SSE4_2__" : 1 00:01:54.412 Fetching value of define "__AES__" : 1 00:01:54.412 Fetching value of define "__AVX__" : 1 00:01:54.412 Fetching value of define "__AVX2__" : 1 00:01:54.412 Fetching value of define "__AVX512BW__" : 1 00:01:54.412 Fetching value of define "__AVX512CD__" : 1 00:01:54.412 Fetching value of define "__AVX512DQ__" : 1 00:01:54.412 Fetching value of define "__AVX512F__" : 1 00:01:54.412 Fetching value of define "__AVX512VL__" : 1 00:01:54.412 Fetching value of define "__PCLMUL__" : 1 00:01:54.412 Fetching value of define "__RDRND__" : 1 00:01:54.412 Fetching value of define "__RDSEED__" : 1 00:01:54.412 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:54.412 Fetching value of define "__znver1__" : (undefined) 00:01:54.412 Fetching value of define "__znver2__" : (undefined) 00:01:54.412 Fetching value of define "__znver3__" : (undefined) 00:01:54.412 Fetching value of define "__znver4__" : (undefined) 00:01:54.412 Compiler for C supports arguments -Wno-format-truncation: NO 00:01:54.412 Message: lib/log: Defining dependency "log" 00:01:54.412 Message: lib/kvargs: Defining dependency "kvargs" 00:01:54.412 Message: lib/telemetry: Defining dependency "telemetry" 00:01:54.412 Checking for function "getentropy" : NO 00:01:54.412 Message: lib/eal: Defining dependency "eal" 00:01:54.412 Message: lib/ring: Defining dependency "ring" 00:01:54.412 Message: lib/rcu: Defining dependency "rcu" 00:01:54.412 Message: lib/mempool: Defining dependency "mempool" 00:01:54.412 Message: lib/mbuf: Defining dependency "mbuf" 00:01:54.412 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:54.412 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:54.413 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:54.413 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:54.413 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:54.413 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:54.413 Compiler for C supports arguments -mpclmul: YES 00:01:54.413 Compiler for C supports arguments -maes: YES 00:01:54.413 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:54.413 Compiler for C supports arguments -mavx512bw: YES 00:01:54.413 Compiler for C supports arguments -mavx512dq: YES 00:01:54.413 Compiler for C supports arguments -mavx512vl: YES 00:01:54.413 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:54.413 Compiler for C supports arguments -mavx2: YES 00:01:54.413 Compiler for C supports arguments -mavx: YES 00:01:54.413 Message: lib/net: Defining dependency "net" 00:01:54.413 Message: lib/meter: Defining dependency "meter" 00:01:54.413 Message: lib/ethdev: Defining dependency "ethdev" 00:01:54.413 Message: lib/pci: Defining dependency "pci" 00:01:54.413 Message: lib/cmdline: Defining dependency "cmdline" 00:01:54.413 Message: lib/hash: Defining dependency "hash" 00:01:54.413 Message: lib/timer: Defining dependency "timer" 00:01:54.413 Message: lib/compressdev: Defining dependency "compressdev" 00:01:54.413 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:54.413 Message: lib/dmadev: Defining dependency "dmadev" 00:01:54.413 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:54.413 Message: lib/power: Defining dependency "power" 00:01:54.413 Message: lib/reorder: Defining dependency "reorder" 00:01:54.413 Message: lib/security: Defining dependency 
"security" 00:01:54.413 Has header "linux/userfaultfd.h" : YES 00:01:54.413 Has header "linux/vduse.h" : YES 00:01:54.413 Message: lib/vhost: Defining dependency "vhost" 00:01:54.413 Compiler for C supports arguments -Wno-format-truncation: NO (cached) 00:01:54.413 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:54.413 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:54.413 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:54.413 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:01:54.413 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:01:54.413 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:01:54.413 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:01:54.413 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:01:54.413 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:01:54.413 Program doxygen found: YES (/usr/local/bin/doxygen) 00:01:54.413 Configuring doxy-api-html.conf using configuration 00:01:54.413 Configuring doxy-api-man.conf using configuration 00:01:54.413 Program mandb found: YES (/usr/bin/mandb) 00:01:54.413 Program sphinx-build found: NO 00:01:54.413 Configuring rte_build_config.h using configuration 00:01:54.413 Message: 00:01:54.413 ================= 00:01:54.413 Applications Enabled 00:01:54.413 ================= 00:01:54.413 00:01:54.413 apps: 00:01:54.413 00:01:54.413 00:01:54.413 Message: 00:01:54.413 ================= 00:01:54.413 Libraries Enabled 00:01:54.413 ================= 00:01:54.413 00:01:54.413 libs: 00:01:54.413 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:54.413 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:01:54.413 cryptodev, dmadev, power, reorder, security, vhost, 00:01:54.413 00:01:54.413 Message: 00:01:54.413 =============== 00:01:54.413 Drivers Enabled 00:01:54.413 =============== 00:01:54.413 00:01:54.413 common: 00:01:54.413 00:01:54.413 bus: 00:01:54.413 pci, vdev, 00:01:54.413 mempool: 00:01:54.413 ring, 00:01:54.413 dma: 00:01:54.413 00:01:54.413 net: 00:01:54.413 00:01:54.413 crypto: 00:01:54.413 00:01:54.413 compress: 00:01:54.413 00:01:54.413 vdpa: 00:01:54.413 00:01:54.413 00:01:54.413 Message: 00:01:54.413 ================= 00:01:54.413 Content Skipped 00:01:54.413 ================= 00:01:54.413 00:01:54.413 apps: 00:01:54.413 dumpcap: explicitly disabled via build config 00:01:54.413 graph: explicitly disabled via build config 00:01:54.413 pdump: explicitly disabled via build config 00:01:54.413 proc-info: explicitly disabled via build config 00:01:54.413 test-acl: explicitly disabled via build config 00:01:54.413 test-bbdev: explicitly disabled via build config 00:01:54.413 test-cmdline: explicitly disabled via build config 00:01:54.413 test-compress-perf: explicitly disabled via build config 00:01:54.413 test-crypto-perf: explicitly disabled via build config 00:01:54.413 test-dma-perf: explicitly disabled via build config 00:01:54.413 test-eventdev: explicitly disabled via build config 00:01:54.413 test-fib: explicitly disabled via build config 00:01:54.413 test-flow-perf: explicitly disabled via build config 00:01:54.413 test-gpudev: explicitly disabled via build config 00:01:54.413 test-mldev: explicitly disabled via build config 00:01:54.413 test-pipeline: explicitly disabled via build config 00:01:54.413 test-pmd: explicitly disabled via build config 00:01:54.413 test-regex: explicitly disabled 
via build config 00:01:54.413 test-sad: explicitly disabled via build config 00:01:54.413 test-security-perf: explicitly disabled via build config 00:01:54.413 00:01:54.413 libs: 00:01:54.413 metrics: explicitly disabled via build config 00:01:54.413 acl: explicitly disabled via build config 00:01:54.413 bbdev: explicitly disabled via build config 00:01:54.413 bitratestats: explicitly disabled via build config 00:01:54.413 bpf: explicitly disabled via build config 00:01:54.413 cfgfile: explicitly disabled via build config 00:01:54.413 distributor: explicitly disabled via build config 00:01:54.413 efd: explicitly disabled via build config 00:01:54.413 eventdev: explicitly disabled via build config 00:01:54.413 dispatcher: explicitly disabled via build config 00:01:54.413 gpudev: explicitly disabled via build config 00:01:54.413 gro: explicitly disabled via build config 00:01:54.413 gso: explicitly disabled via build config 00:01:54.413 ip_frag: explicitly disabled via build config 00:01:54.413 jobstats: explicitly disabled via build config 00:01:54.413 latencystats: explicitly disabled via build config 00:01:54.413 lpm: explicitly disabled via build config 00:01:54.413 member: explicitly disabled via build config 00:01:54.413 pcapng: explicitly disabled via build config 00:01:54.413 rawdev: explicitly disabled via build config 00:01:54.413 regexdev: explicitly disabled via build config 00:01:54.413 mldev: explicitly disabled via build config 00:01:54.413 rib: explicitly disabled via build config 00:01:54.413 sched: explicitly disabled via build config 00:01:54.413 stack: explicitly disabled via build config 00:01:54.413 ipsec: explicitly disabled via build config 00:01:54.413 pdcp: explicitly disabled via build config 00:01:54.413 fib: explicitly disabled via build config 00:01:54.413 port: explicitly disabled via build config 00:01:54.413 pdump: explicitly disabled via build config 00:01:54.413 table: explicitly disabled via build config 00:01:54.413 pipeline: explicitly disabled via build config 00:01:54.413 graph: explicitly disabled via build config 00:01:54.413 node: explicitly disabled via build config 00:01:54.413 00:01:54.413 drivers: 00:01:54.413 common/cpt: not in enabled drivers build config 00:01:54.413 common/dpaax: not in enabled drivers build config 00:01:54.413 common/iavf: not in enabled drivers build config 00:01:54.413 common/idpf: not in enabled drivers build config 00:01:54.413 common/mvep: not in enabled drivers build config 00:01:54.413 common/octeontx: not in enabled drivers build config 00:01:54.413 bus/auxiliary: not in enabled drivers build config 00:01:54.413 bus/cdx: not in enabled drivers build config 00:01:54.413 bus/dpaa: not in enabled drivers build config 00:01:54.413 bus/fslmc: not in enabled drivers build config 00:01:54.413 bus/ifpga: not in enabled drivers build config 00:01:54.413 bus/platform: not in enabled drivers build config 00:01:54.413 bus/vmbus: not in enabled drivers build config 00:01:54.413 common/cnxk: not in enabled drivers build config 00:01:54.413 common/mlx5: not in enabled drivers build config 00:01:54.413 common/nfp: not in enabled drivers build config 00:01:54.413 common/qat: not in enabled drivers build config 00:01:54.413 common/sfc_efx: not in enabled drivers build config 00:01:54.413 mempool/bucket: not in enabled drivers build config 00:01:54.413 mempool/cnxk: not in enabled drivers build config 00:01:54.413 mempool/dpaa: not in enabled drivers build config 00:01:54.413 mempool/dpaa2: not in enabled drivers build config 
00:01:54.413 mempool/octeontx: not in enabled drivers build config 00:01:54.413 mempool/stack: not in enabled drivers build config 00:01:54.413 dma/cnxk: not in enabled drivers build config 00:01:54.413 dma/dpaa: not in enabled drivers build config 00:01:54.413 dma/dpaa2: not in enabled drivers build config 00:01:54.413 dma/hisilicon: not in enabled drivers build config 00:01:54.413 dma/idxd: not in enabled drivers build config 00:01:54.413 dma/ioat: not in enabled drivers build config 00:01:54.413 dma/skeleton: not in enabled drivers build config 00:01:54.413 net/af_packet: not in enabled drivers build config 00:01:54.413 net/af_xdp: not in enabled drivers build config 00:01:54.413 net/ark: not in enabled drivers build config 00:01:54.413 net/atlantic: not in enabled drivers build config 00:01:54.413 net/avp: not in enabled drivers build config 00:01:54.413 net/axgbe: not in enabled drivers build config 00:01:54.413 net/bnx2x: not in enabled drivers build config 00:01:54.413 net/bnxt: not in enabled drivers build config 00:01:54.413 net/bonding: not in enabled drivers build config 00:01:54.413 net/cnxk: not in enabled drivers build config 00:01:54.413 net/cpfl: not in enabled drivers build config 00:01:54.413 net/cxgbe: not in enabled drivers build config 00:01:54.413 net/dpaa: not in enabled drivers build config 00:01:54.413 net/dpaa2: not in enabled drivers build config 00:01:54.413 net/e1000: not in enabled drivers build config 00:01:54.413 net/ena: not in enabled drivers build config 00:01:54.413 net/enetc: not in enabled drivers build config 00:01:54.413 net/enetfec: not in enabled drivers build config 00:01:54.413 net/enic: not in enabled drivers build config 00:01:54.413 net/failsafe: not in enabled drivers build config 00:01:54.414 net/fm10k: not in enabled drivers build config 00:01:54.414 net/gve: not in enabled drivers build config 00:01:54.414 net/hinic: not in enabled drivers build config 00:01:54.414 net/hns3: not in enabled drivers build config 00:01:54.414 net/i40e: not in enabled drivers build config 00:01:54.414 net/iavf: not in enabled drivers build config 00:01:54.414 net/ice: not in enabled drivers build config 00:01:54.414 net/idpf: not in enabled drivers build config 00:01:54.414 net/igc: not in enabled drivers build config 00:01:54.414 net/ionic: not in enabled drivers build config 00:01:54.414 net/ipn3ke: not in enabled drivers build config 00:01:54.414 net/ixgbe: not in enabled drivers build config 00:01:54.414 net/mana: not in enabled drivers build config 00:01:54.414 net/memif: not in enabled drivers build config 00:01:54.414 net/mlx4: not in enabled drivers build config 00:01:54.414 net/mlx5: not in enabled drivers build config 00:01:54.414 net/mvneta: not in enabled drivers build config 00:01:54.414 net/mvpp2: not in enabled drivers build config 00:01:54.414 net/netvsc: not in enabled drivers build config 00:01:54.414 net/nfb: not in enabled drivers build config 00:01:54.414 net/nfp: not in enabled drivers build config 00:01:54.414 net/ngbe: not in enabled drivers build config 00:01:54.414 net/null: not in enabled drivers build config 00:01:54.414 net/octeontx: not in enabled drivers build config 00:01:54.414 net/octeon_ep: not in enabled drivers build config 00:01:54.414 net/pcap: not in enabled drivers build config 00:01:54.414 net/pfe: not in enabled drivers build config 00:01:54.414 net/qede: not in enabled drivers build config 00:01:54.414 net/ring: not in enabled drivers build config 00:01:54.414 net/sfc: not in enabled drivers build config 00:01:54.414 
net/softnic: not in enabled drivers build config 00:01:54.414 net/tap: not in enabled drivers build config 00:01:54.414 net/thunderx: not in enabled drivers build config 00:01:54.414 net/txgbe: not in enabled drivers build config 00:01:54.414 net/vdev_netvsc: not in enabled drivers build config 00:01:54.414 net/vhost: not in enabled drivers build config 00:01:54.414 net/virtio: not in enabled drivers build config 00:01:54.414 net/vmxnet3: not in enabled drivers build config 00:01:54.414 raw/*: missing internal dependency, "rawdev" 00:01:54.414 crypto/armv8: not in enabled drivers build config 00:01:54.414 crypto/bcmfs: not in enabled drivers build config 00:01:54.414 crypto/caam_jr: not in enabled drivers build config 00:01:54.414 crypto/ccp: not in enabled drivers build config 00:01:54.414 crypto/cnxk: not in enabled drivers build config 00:01:54.414 crypto/dpaa_sec: not in enabled drivers build config 00:01:54.414 crypto/dpaa2_sec: not in enabled drivers build config 00:01:54.414 crypto/ipsec_mb: not in enabled drivers build config 00:01:54.414 crypto/mlx5: not in enabled drivers build config 00:01:54.414 crypto/mvsam: not in enabled drivers build config 00:01:54.414 crypto/nitrox: not in enabled drivers build config 00:01:54.414 crypto/null: not in enabled drivers build config 00:01:54.414 crypto/octeontx: not in enabled drivers build config 00:01:54.414 crypto/openssl: not in enabled drivers build config 00:01:54.414 crypto/scheduler: not in enabled drivers build config 00:01:54.414 crypto/uadk: not in enabled drivers build config 00:01:54.414 crypto/virtio: not in enabled drivers build config 00:01:54.414 compress/isal: not in enabled drivers build config 00:01:54.414 compress/mlx5: not in enabled drivers build config 00:01:54.414 compress/octeontx: not in enabled drivers build config 00:01:54.414 compress/zlib: not in enabled drivers build config 00:01:54.414 regex/*: missing internal dependency, "regexdev" 00:01:54.414 ml/*: missing internal dependency, "mldev" 00:01:54.414 vdpa/ifc: not in enabled drivers build config 00:01:54.414 vdpa/mlx5: not in enabled drivers build config 00:01:54.414 vdpa/nfp: not in enabled drivers build config 00:01:54.414 vdpa/sfc: not in enabled drivers build config 00:01:54.414 event/*: missing internal dependency, "eventdev" 00:01:54.414 baseband/*: missing internal dependency, "bbdev" 00:01:54.414 gpu/*: missing internal dependency, "gpudev" 00:01:54.414 00:01:54.414 00:01:54.414 Build targets in project: 85 00:01:54.414 00:01:54.414 DPDK 23.11.0 00:01:54.414 00:01:54.414 User defined options 00:01:54.414 buildtype : debug 00:01:54.414 default_library : static 00:01:54.414 libdir : lib 00:01:54.414 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:01:54.414 c_args : -fPIC -Werror 00:01:54.414 c_link_args : 00:01:54.414 cpu_instruction_set: native 00:01:54.414 disable_apps : test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev 00:01:54.414 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,pcapng,bbdev 00:01:54.414 enable_docs : false 00:01:54.414 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring 00:01:54.414 enable_kmods : false 00:01:54.414 tests : false 
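The "User defined options" block above maps directly onto a meson setup command line. A sketch of the equivalent invocation, run from the spdk checkout; the long disable_apps/disable_libs lists are exactly the ones printed above and are omitted here for brevity, and SPDK's dpdkbuild machinery normally generates this command:

    # Configure and build the DPDK submodule per the options above (abridged sketch).
    meson setup \
      --buildtype=debug --default-library=static --libdir=lib \
      --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build \
      -Dc_args='-fPIC -Werror' \
      -Dcpu_instruction_set=native \
      -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring \
      -Denable_docs=false -Denable_kmods=false -Dtests=false \
      dpdk/build-tmp dpdk
    ninja -C dpdk/build-tmp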
00:01:54.414 00:01:54.414 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:54.678 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp' 00:01:54.678 [1/265] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:54.678 [2/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:54.678 [3/265] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:54.678 [4/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:54.678 [5/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:54.678 [6/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:54.678 [7/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:54.679 [8/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:54.679 [9/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:54.679 [10/265] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:54.679 [11/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:54.679 [12/265] Linking static target lib/librte_kvargs.a 00:01:54.679 [13/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:54.679 [14/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:54.679 [15/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:54.679 [16/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:54.679 [17/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:54.679 [18/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:54.679 [19/265] Linking static target lib/librte_log.a 00:01:54.679 [20/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:54.679 [21/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:54.679 [22/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:54.679 [23/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:54.679 [24/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:54.679 [25/265] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:54.679 [26/265] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:54.679 [27/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:54.679 [28/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:54.944 [29/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:54.944 [30/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:54.944 [31/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:54.944 [32/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:54.944 [33/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:54.944 [34/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:54.944 [35/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:54.944 [36/265] Linking static target lib/librte_pci.a 00:01:54.944 [37/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:54.944 [38/265] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:54.944 [39/265] Compiling C object 
lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:54.944 [40/265] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:54.944 [41/265] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:55.203 [42/265] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.203 [43/265] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.203 [44/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:55.203 [45/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:55.203 [46/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:55.203 [47/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:55.203 [48/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:55.203 [49/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:55.203 [50/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:55.203 [51/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:55.203 [52/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:55.203 [53/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:55.203 [54/265] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:55.203 [55/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:55.203 [56/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:55.203 [57/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:55.203 [58/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:55.203 [59/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:55.203 [60/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:55.203 [61/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:55.203 [62/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:55.203 [63/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:55.203 [64/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:55.203 [65/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:55.203 [66/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:55.203 [67/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:55.203 [68/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:55.203 [69/265] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:55.203 [70/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:55.203 [71/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:55.203 [72/265] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:55.203 [73/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:55.203 [74/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:55.203 [75/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:55.203 [76/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:55.203 [77/265] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:55.203 [78/265] Linking static target lib/librte_telemetry.a 00:01:55.203 
[79/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:55.203 [80/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:55.203 [81/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:55.203 [82/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:55.203 [83/265] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:55.203 [84/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:55.203 [85/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:55.203 [86/265] Linking static target lib/librte_meter.a 00:01:55.203 [87/265] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:55.203 [88/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:55.203 [89/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:55.203 [90/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:55.203 [91/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:55.462 [92/265] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:55.462 [93/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:55.462 [94/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:55.462 [95/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:55.462 [96/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:55.462 [97/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:55.462 [98/265] Linking static target lib/librte_ring.a 00:01:55.462 [99/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:55.462 [100/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:55.462 [101/265] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:55.462 [102/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:55.462 [103/265] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:55.462 [104/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:55.462 [105/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:55.462 [106/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:55.462 [107/265] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:55.462 [108/265] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.462 [109/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:55.462 [110/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:55.462 [111/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:55.462 [112/265] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:55.462 [113/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:55.462 [114/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:55.462 [115/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:55.462 [116/265] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:55.462 [117/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:55.462 [118/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 
00:01:55.462 [119/265] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:55.462 [120/265] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:55.462 [121/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:55.462 [122/265] Linking static target lib/librte_timer.a 00:01:55.462 [123/265] Linking static target lib/librte_cmdline.a 00:01:55.462 [124/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:55.462 [125/265] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:55.462 [126/265] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:55.462 [127/265] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:55.462 [128/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:55.462 [129/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:55.462 [130/265] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:55.462 [131/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:55.462 [132/265] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:55.462 [133/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:55.462 [134/265] Linking static target lib/librte_mempool.a 00:01:55.462 [135/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:55.462 [136/265] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:55.462 [137/265] Linking static target lib/librte_rcu.a 00:01:55.462 [138/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:55.462 [139/265] Linking target lib/librte_log.so.24.0 00:01:55.462 [140/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:55.462 [141/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:55.462 [142/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:55.462 [143/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:55.462 [144/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:55.462 [145/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:55.462 [146/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:55.462 [147/265] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:55.462 [148/265] Linking static target lib/librte_compressdev.a 00:01:55.462 [149/265] Linking static target lib/librte_dmadev.a 00:01:55.462 [150/265] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:55.462 [151/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:55.462 [152/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:55.462 [153/265] Linking static target lib/librte_net.a 00:01:55.462 [154/265] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:55.462 [155/265] Linking static target lib/librte_mbuf.a 00:01:55.462 [156/265] Linking static target lib/librte_eal.a 00:01:55.462 [157/265] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:55.462 [158/265] Linking static target lib/librte_power.a 00:01:55.462 [159/265] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:55.462 [160/265] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:01:55.462 [161/265] Linking static target lib/librte_reorder.a 00:01:55.462 
[162/265] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:55.462 [163/265] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:55.462 [164/265] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:55.462 [165/265] Linking target lib/librte_kvargs.so.24.0 00:01:55.462 [166/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:55.721 [167/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:55.721 [168/265] Linking static target lib/librte_security.a 00:01:55.721 [169/265] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:55.721 [170/265] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.721 [171/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:55.721 [172/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:55.721 [173/265] Linking static target lib/librte_hash.a 00:01:55.721 [174/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:55.721 [175/265] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:55.721 [176/265] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.721 [177/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:55.721 [178/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:55.721 [179/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:55.721 [180/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:55.721 [181/265] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:55.721 [182/265] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:55.721 [183/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:55.721 [184/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:55.721 [185/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:55.721 [186/265] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:55.721 [187/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:55.721 [188/265] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:55.721 [189/265] Linking static target lib/librte_cryptodev.a 00:01:55.721 [190/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:55.721 [191/265] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:55.721 [192/265] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.721 [193/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:55.721 [194/265] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.721 [195/265] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:55.981 [196/265] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.981 [197/265] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.981 [198/265] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:55.981 [199/265] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:55.981 [200/265] Linking static target drivers/librte_bus_vdev.a 00:01:55.981 [201/265] Compiling C 
object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:55.981 [202/265] Linking target lib/librte_telemetry.so.24.0 00:01:55.981 [203/265] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:55.981 [204/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:55.981 [205/265] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:55.981 [206/265] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:55.981 [207/265] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:55.981 [208/265] Linking static target lib/librte_ethdev.a 00:01:55.981 [209/265] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.981 [210/265] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:55.981 [211/265] Linking static target drivers/librte_bus_pci.a 00:01:55.981 [212/265] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:55.981 [213/265] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.981 [214/265] Linking static target drivers/librte_mempool_ring.a 00:01:55.981 [215/265] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:56.240 [216/265] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.240 [217/265] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.240 [218/265] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.240 [219/265] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.498 [220/265] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.498 [221/265] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.498 [222/265] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.757 [223/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:56.757 [224/265] Linking static target lib/librte_vhost.a 00:01:56.757 [225/265] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.757 [226/265] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.137 [227/265] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.073 [228/265] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.638 [229/265] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.168 [230/265] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.426 [231/265] Linking target lib/librte_eal.so.24.0 00:02:08.426 [232/265] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:08.684 [233/265] Linking target lib/librte_dmadev.so.24.0 00:02:08.684 [234/265] Linking target lib/librte_meter.so.24.0 00:02:08.684 [235/265] Linking target lib/librte_ring.so.24.0 00:02:08.684 [236/265] Linking target drivers/librte_bus_vdev.so.24.0 00:02:08.684 [237/265] Linking target lib/librte_timer.so.24.0 00:02:08.684 [238/265] Linking target lib/librte_pci.so.24.0 
00:02:08.684 [239/265] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:08.684 [240/265] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:08.684 [241/265] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:08.684 [242/265] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:08.684 [243/265] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:08.684 [244/265] Linking target drivers/librte_bus_pci.so.24.0 00:02:08.684 [245/265] Linking target lib/librte_rcu.so.24.0 00:02:08.684 [246/265] Linking target lib/librte_mempool.so.24.0 00:02:08.942 [247/265] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:08.942 [248/265] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:08.942 [249/265] Linking target lib/librte_mbuf.so.24.0 00:02:08.942 [250/265] Linking target drivers/librte_mempool_ring.so.24.0 00:02:09.199 [251/265] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:09.199 [252/265] Linking target lib/librte_reorder.so.24.0 00:02:09.199 [253/265] Linking target lib/librte_net.so.24.0 00:02:09.199 [254/265] Linking target lib/librte_compressdev.so.24.0 00:02:09.199 [255/265] Linking target lib/librte_cryptodev.so.24.0 00:02:09.199 [256/265] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:09.199 [257/265] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:09.457 [258/265] Linking target lib/librte_hash.so.24.0 00:02:09.457 [259/265] Linking target lib/librte_cmdline.so.24.0 00:02:09.457 [260/265] Linking target lib/librte_security.so.24.0 00:02:09.457 [261/265] Linking target lib/librte_ethdev.so.24.0 00:02:09.457 [262/265] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:09.457 [263/265] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:09.715 [264/265] Linking target lib/librte_vhost.so.24.0 00:02:09.715 [265/265] Linking target lib/librte_power.so.24.0 00:02:09.715 INFO: autodetecting backend as ninja 00:02:09.715 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp -j 112 00:02:10.646 CC lib/ut/ut.o 00:02:10.646 CC lib/log/log.o 00:02:10.646 CC lib/log/log_flags.o 00:02:10.646 CC lib/ut_mock/mock.o 00:02:10.646 CC lib/log/log_deprecated.o 00:02:10.646 LIB libspdk_ut.a 00:02:10.646 LIB libspdk_ut_mock.a 00:02:10.646 LIB libspdk_log.a 00:02:10.904 CC lib/ioat/ioat.o 00:02:11.162 CXX lib/trace_parser/trace.o 00:02:11.162 CC lib/util/base64.o 00:02:11.162 CC lib/util/bit_array.o 00:02:11.162 CC lib/util/cpuset.o 00:02:11.162 CC lib/util/crc32c.o 00:02:11.162 CC lib/dma/dma.o 00:02:11.162 CC lib/util/crc16.o 00:02:11.162 CC lib/util/crc32.o 00:02:11.162 CC lib/util/crc32_ieee.o 00:02:11.162 CC lib/util/fd.o 00:02:11.162 CC lib/util/crc64.o 00:02:11.162 CC lib/util/dif.o 00:02:11.162 CC lib/util/file.o 00:02:11.162 CC lib/util/hexlify.o 00:02:11.162 CC lib/util/iov.o 00:02:11.162 CC lib/util/math.o 00:02:11.162 CC lib/util/pipe.o 00:02:11.162 CC lib/util/strerror_tls.o 00:02:11.162 CC lib/util/fd_group.o 00:02:11.162 CC lib/util/string.o 00:02:11.162 CC lib/util/uuid.o 00:02:11.162 CC lib/util/xor.o 00:02:11.162 CC lib/util/zipf.o 00:02:11.162 CC lib/vfio_user/host/vfio_user_pci.o 00:02:11.162 CC 
lib/vfio_user/host/vfio_user.o 00:02:11.162 LIB libspdk_dma.a 00:02:11.162 LIB libspdk_ioat.a 00:02:11.419 LIB libspdk_vfio_user.a 00:02:11.419 LIB libspdk_util.a 00:02:11.419 LIB libspdk_trace_parser.a 00:02:11.677 CC lib/rdma/rdma_verbs.o 00:02:11.677 CC lib/rdma/common.o 00:02:11.677 CC lib/conf/conf.o 00:02:11.677 CC lib/vmd/vmd.o 00:02:11.677 CC lib/vmd/led.o 00:02:11.677 CC lib/json/json_parse.o 00:02:11.677 CC lib/json/json_util.o 00:02:11.677 CC lib/env_dpdk/pci.o 00:02:11.677 CC lib/json/json_write.o 00:02:11.677 CC lib/env_dpdk/env.o 00:02:11.677 CC lib/env_dpdk/memory.o 00:02:11.677 CC lib/idxd/idxd.o 00:02:11.677 CC lib/idxd/idxd_user.o 00:02:11.677 CC lib/env_dpdk/init.o 00:02:11.677 CC lib/idxd/idxd_kernel.o 00:02:11.677 CC lib/env_dpdk/threads.o 00:02:11.677 CC lib/env_dpdk/pci_ioat.o 00:02:11.677 CC lib/env_dpdk/pci_virtio.o 00:02:11.677 CC lib/env_dpdk/pci_vmd.o 00:02:11.677 CC lib/env_dpdk/pci_idxd.o 00:02:11.677 CC lib/env_dpdk/pci_event.o 00:02:11.677 CC lib/env_dpdk/sigbus_handler.o 00:02:11.677 CC lib/env_dpdk/pci_dpdk.o 00:02:11.677 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:11.677 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:11.936 LIB libspdk_conf.a 00:02:11.936 LIB libspdk_rdma.a 00:02:11.936 LIB libspdk_json.a 00:02:11.936 LIB libspdk_idxd.a 00:02:11.936 LIB libspdk_vmd.a 00:02:12.195 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:12.195 CC lib/jsonrpc/jsonrpc_server.o 00:02:12.195 CC lib/jsonrpc/jsonrpc_client.o 00:02:12.195 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:12.195 LIB libspdk_jsonrpc.a 00:02:12.453 CC lib/rpc/rpc.o 00:02:12.453 LIB libspdk_env_dpdk.a 00:02:12.711 LIB libspdk_rpc.a 00:02:12.970 CC lib/trace/trace.o 00:02:12.970 CC lib/trace/trace_flags.o 00:02:12.970 CC lib/trace/trace_rpc.o 00:02:12.970 CC lib/notify/notify_rpc.o 00:02:12.970 CC lib/notify/notify.o 00:02:12.970 CC lib/sock/sock.o 00:02:12.970 CC lib/sock/sock_rpc.o 00:02:13.228 LIB libspdk_notify.a 00:02:13.228 LIB libspdk_trace.a 00:02:13.228 LIB libspdk_sock.a 00:02:13.488 CC lib/thread/thread.o 00:02:13.488 CC lib/thread/iobuf.o 00:02:13.488 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:13.488 CC lib/nvme/nvme_ctrlr.o 00:02:13.488 CC lib/nvme/nvme_ns.o 00:02:13.488 CC lib/nvme/nvme_fabric.o 00:02:13.488 CC lib/nvme/nvme_ns_cmd.o 00:02:13.488 CC lib/nvme/nvme_pcie_common.o 00:02:13.488 CC lib/nvme/nvme_quirks.o 00:02:13.488 CC lib/nvme/nvme_pcie.o 00:02:13.488 CC lib/nvme/nvme.o 00:02:13.488 CC lib/nvme/nvme_qpair.o 00:02:13.488 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:13.488 CC lib/nvme/nvme_transport.o 00:02:13.488 CC lib/nvme/nvme_discovery.o 00:02:13.488 CC lib/nvme/nvme_tcp.o 00:02:13.488 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:13.488 CC lib/nvme/nvme_opal.o 00:02:13.488 CC lib/nvme/nvme_io_msg.o 00:02:13.488 CC lib/nvme/nvme_poll_group.o 00:02:13.488 CC lib/nvme/nvme_zns.o 00:02:13.488 CC lib/nvme/nvme_cuse.o 00:02:13.488 CC lib/nvme/nvme_vfio_user.o 00:02:13.488 CC lib/nvme/nvme_rdma.o 00:02:14.433 LIB libspdk_thread.a 00:02:14.433 CC lib/virtio/virtio.o 00:02:14.433 CC lib/vfu_tgt/tgt_endpoint.o 00:02:14.433 CC lib/virtio/virtio_vhost_user.o 00:02:14.433 CC lib/vfu_tgt/tgt_rpc.o 00:02:14.433 CC lib/virtio/virtio_vfio_user.o 00:02:14.433 CC lib/virtio/virtio_pci.o 00:02:14.433 CC lib/accel/accel_sw.o 00:02:14.433 CC lib/accel/accel.o 00:02:14.433 CC lib/accel/accel_rpc.o 00:02:14.433 CC lib/blob/blobstore.o 00:02:14.433 CC lib/blob/request.o 00:02:14.433 CC lib/blob/zeroes.o 00:02:14.433 CC lib/blob/blob_bs_dev.o 00:02:14.433 CC lib/init/subsystem_rpc.o 00:02:14.433 CC lib/init/json_config.o 00:02:14.433 CC 
lib/init/subsystem.o 00:02:14.433 CC lib/init/rpc.o 00:02:14.692 LIB libspdk_virtio.a 00:02:14.692 LIB libspdk_init.a 00:02:14.692 LIB libspdk_vfu_tgt.a 00:02:14.692 LIB libspdk_nvme.a 00:02:14.951 CC lib/event/app_rpc.o 00:02:14.951 CC lib/event/app.o 00:02:14.951 CC lib/event/reactor.o 00:02:14.951 CC lib/event/log_rpc.o 00:02:14.951 CC lib/event/scheduler_static.o 00:02:15.210 LIB libspdk_accel.a 00:02:15.210 LIB libspdk_event.a 00:02:15.469 CC lib/bdev/bdev.o 00:02:15.469 CC lib/bdev/part.o 00:02:15.469 CC lib/bdev/bdev_rpc.o 00:02:15.469 CC lib/bdev/bdev_zone.o 00:02:15.469 CC lib/bdev/scsi_nvme.o 00:02:16.036 LIB libspdk_blob.a 00:02:16.297 CC lib/lvol/lvol.o 00:02:16.297 CC lib/blobfs/blobfs.o 00:02:16.297 CC lib/blobfs/tree.o 00:02:16.866 LIB libspdk_lvol.a 00:02:16.866 LIB libspdk_blobfs.a 00:02:17.179 LIB libspdk_bdev.a 00:02:17.479 CC lib/nbd/nbd.o 00:02:17.479 CC lib/nbd/nbd_rpc.o 00:02:17.479 CC lib/ublk/ublk.o 00:02:17.479 CC lib/ublk/ublk_rpc.o 00:02:17.479 CC lib/nvmf/ctrlr.o 00:02:17.479 CC lib/nvmf/subsystem.o 00:02:17.480 CC lib/nvmf/ctrlr_discovery.o 00:02:17.480 CC lib/nvmf/nvmf.o 00:02:17.480 CC lib/scsi/lun.o 00:02:17.480 CC lib/nvmf/ctrlr_bdev.o 00:02:17.480 CC lib/scsi/dev.o 00:02:17.480 CC lib/ftl/ftl_core.o 00:02:17.480 CC lib/nvmf/transport.o 00:02:17.480 CC lib/scsi/port.o 00:02:17.480 CC lib/ftl/ftl_init.o 00:02:17.480 CC lib/nvmf/nvmf_rpc.o 00:02:17.480 CC lib/ftl/ftl_debug.o 00:02:17.480 CC lib/ftl/ftl_layout.o 00:02:17.480 CC lib/scsi/scsi.o 00:02:17.480 CC lib/ftl/ftl_sb.o 00:02:17.480 CC lib/scsi/scsi_bdev.o 00:02:17.480 CC lib/nvmf/tcp.o 00:02:17.480 CC lib/ftl/ftl_io.o 00:02:17.480 CC lib/scsi/scsi_pr.o 00:02:17.480 CC lib/nvmf/vfio_user.o 00:02:17.480 CC lib/scsi/scsi_rpc.o 00:02:17.480 CC lib/nvmf/rdma.o 00:02:17.480 CC lib/ftl/ftl_l2p.o 00:02:17.480 CC lib/scsi/task.o 00:02:17.480 CC lib/ftl/ftl_l2p_flat.o 00:02:17.480 CC lib/ftl/ftl_nv_cache.o 00:02:17.480 CC lib/ftl/ftl_band.o 00:02:17.480 CC lib/ftl/ftl_band_ops.o 00:02:17.480 CC lib/ftl/ftl_writer.o 00:02:17.480 CC lib/ftl/ftl_rq.o 00:02:17.480 CC lib/ftl/ftl_reloc.o 00:02:17.480 CC lib/ftl/ftl_l2p_cache.o 00:02:17.480 CC lib/ftl/ftl_p2l.o 00:02:17.480 CC lib/ftl/mngt/ftl_mngt.o 00:02:17.480 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:17.480 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:17.480 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:17.480 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:17.480 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:17.480 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:17.480 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:17.480 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:17.480 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:17.480 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:17.480 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:17.480 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:17.480 CC lib/ftl/utils/ftl_md.o 00:02:17.480 CC lib/ftl/utils/ftl_mempool.o 00:02:17.480 CC lib/ftl/utils/ftl_conf.o 00:02:17.480 CC lib/ftl/utils/ftl_bitmap.o 00:02:17.480 CC lib/ftl/utils/ftl_property.o 00:02:17.480 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:17.480 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:17.480 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:17.480 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:17.480 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:17.480 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:17.480 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:17.480 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:17.480 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:17.480 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:17.480 CC lib/ftl/base/ftl_base_dev.o 00:02:17.480 CC lib/ftl/ftl_trace.o 
00:02:17.480 CC lib/ftl/base/ftl_base_bdev.o 00:02:17.762 LIB libspdk_nbd.a 00:02:17.762 LIB libspdk_scsi.a 00:02:17.762 LIB libspdk_ublk.a 00:02:18.033 LIB libspdk_ftl.a 00:02:18.033 CC lib/vhost/vhost_rpc.o 00:02:18.033 CC lib/vhost/vhost.o 00:02:18.033 CC lib/vhost/vhost_scsi.o 00:02:18.033 CC lib/vhost/vhost_blk.o 00:02:18.033 CC lib/vhost/rte_vhost_user.o 00:02:18.033 CC lib/iscsi/init_grp.o 00:02:18.033 CC lib/iscsi/conn.o 00:02:18.033 CC lib/iscsi/md5.o 00:02:18.033 CC lib/iscsi/iscsi.o 00:02:18.033 CC lib/iscsi/portal_grp.o 00:02:18.033 CC lib/iscsi/param.o 00:02:18.033 CC lib/iscsi/tgt_node.o 00:02:18.033 CC lib/iscsi/iscsi_subsystem.o 00:02:18.033 CC lib/iscsi/iscsi_rpc.o 00:02:18.033 CC lib/iscsi/task.o 00:02:18.602 LIB libspdk_nvmf.a 00:02:18.602 LIB libspdk_vhost.a 00:02:18.861 LIB libspdk_iscsi.a 00:02:19.120 CC module/env_dpdk/env_dpdk_rpc.o 00:02:19.120 CC module/vfu_device/vfu_virtio_blk.o 00:02:19.120 CC module/vfu_device/vfu_virtio.o 00:02:19.120 CC module/vfu_device/vfu_virtio_scsi.o 00:02:19.120 CC module/vfu_device/vfu_virtio_rpc.o 00:02:19.378 LIB libspdk_env_dpdk_rpc.a 00:02:19.378 CC module/scheduler/gscheduler/gscheduler.o 00:02:19.378 CC module/blob/bdev/blob_bdev.o 00:02:19.378 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:19.378 CC module/accel/ioat/accel_ioat_rpc.o 00:02:19.378 CC module/accel/ioat/accel_ioat.o 00:02:19.378 CC module/sock/posix/posix.o 00:02:19.378 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:19.378 CC module/accel/dsa/accel_dsa.o 00:02:19.378 CC module/accel/dsa/accel_dsa_rpc.o 00:02:19.378 CC module/accel/iaa/accel_iaa.o 00:02:19.378 CC module/accel/iaa/accel_iaa_rpc.o 00:02:19.378 CC module/accel/error/accel_error.o 00:02:19.378 CC module/accel/error/accel_error_rpc.o 00:02:19.378 LIB libspdk_scheduler_gscheduler.a 00:02:19.378 LIB libspdk_scheduler_dpdk_governor.a 00:02:19.637 LIB libspdk_scheduler_dynamic.a 00:02:19.637 LIB libspdk_accel_ioat.a 00:02:19.637 LIB libspdk_accel_error.a 00:02:19.637 LIB libspdk_accel_iaa.a 00:02:19.637 LIB libspdk_blob_bdev.a 00:02:19.637 LIB libspdk_accel_dsa.a 00:02:19.637 LIB libspdk_vfu_device.a 00:02:19.894 LIB libspdk_sock_posix.a 00:02:19.894 CC module/bdev/gpt/gpt.o 00:02:19.894 CC module/bdev/gpt/vbdev_gpt.o 00:02:19.894 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:19.894 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:19.894 CC module/bdev/lvol/vbdev_lvol.o 00:02:19.894 CC module/bdev/delay/vbdev_delay.o 00:02:19.894 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:19.894 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:19.894 CC module/bdev/passthru/vbdev_passthru.o 00:02:19.894 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:19.894 CC module/blobfs/bdev/blobfs_bdev.o 00:02:19.894 CC module/bdev/split/vbdev_split_rpc.o 00:02:19.894 CC module/bdev/split/vbdev_split.o 00:02:19.894 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:19.894 CC module/bdev/error/vbdev_error.o 00:02:19.894 CC module/bdev/error/vbdev_error_rpc.o 00:02:19.894 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:19.894 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:19.894 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:19.895 CC module/bdev/raid/bdev_raid.o 00:02:19.895 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:19.895 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:19.895 CC module/bdev/iscsi/bdev_iscsi.o 00:02:19.895 CC module/bdev/raid/bdev_raid_rpc.o 00:02:19.895 CC module/bdev/malloc/bdev_malloc.o 00:02:19.895 CC module/bdev/raid/bdev_raid_sb.o 00:02:19.895 CC module/bdev/nvme/bdev_nvme.o 00:02:19.895 CC 
module/bdev/raid/raid0.o 00:02:19.895 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:19.895 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:19.895 CC module/bdev/aio/bdev_aio.o 00:02:19.895 CC module/bdev/nvme/nvme_rpc.o 00:02:19.895 CC module/bdev/raid/raid1.o 00:02:19.895 CC module/bdev/nvme/bdev_mdns_client.o 00:02:19.895 CC module/bdev/raid/concat.o 00:02:19.895 CC module/bdev/aio/bdev_aio_rpc.o 00:02:19.895 CC module/bdev/ftl/bdev_ftl.o 00:02:19.895 CC module/bdev/nvme/vbdev_opal.o 00:02:19.895 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:19.895 CC module/bdev/null/bdev_null_rpc.o 00:02:19.895 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:19.895 CC module/bdev/null/bdev_null.o 00:02:20.153 LIB libspdk_blobfs_bdev.a 00:02:20.153 LIB libspdk_bdev_split.a 00:02:20.153 LIB libspdk_bdev_gpt.a 00:02:20.153 LIB libspdk_bdev_error.a 00:02:20.153 LIB libspdk_bdev_passthru.a 00:02:20.153 LIB libspdk_bdev_null.a 00:02:20.153 LIB libspdk_bdev_ftl.a 00:02:20.153 LIB libspdk_bdev_zone_block.a 00:02:20.153 LIB libspdk_bdev_delay.a 00:02:20.153 LIB libspdk_bdev_aio.a 00:02:20.153 LIB libspdk_bdev_iscsi.a 00:02:20.153 LIB libspdk_bdev_malloc.a 00:02:20.413 LIB libspdk_bdev_lvol.a 00:02:20.413 LIB libspdk_bdev_virtio.a 00:02:20.413 LIB libspdk_bdev_raid.a 00:02:21.351 LIB libspdk_bdev_nvme.a 00:02:21.610 CC module/event/subsystems/scheduler/scheduler.o 00:02:21.610 CC module/event/subsystems/vmd/vmd.o 00:02:21.610 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:21.610 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:21.610 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:21.610 CC module/event/subsystems/iobuf/iobuf.o 00:02:21.610 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:21.869 CC module/event/subsystems/sock/sock.o 00:02:21.869 LIB libspdk_event_scheduler.a 00:02:21.869 LIB libspdk_event_vfu_tgt.a 00:02:21.869 LIB libspdk_event_vhost_blk.a 00:02:21.869 LIB libspdk_event_vmd.a 00:02:21.869 LIB libspdk_event_sock.a 00:02:21.869 LIB libspdk_event_iobuf.a 00:02:22.128 CC module/event/subsystems/accel/accel.o 00:02:22.387 LIB libspdk_event_accel.a 00:02:22.646 CC module/event/subsystems/bdev/bdev.o 00:02:22.646 LIB libspdk_event_bdev.a 00:02:22.905 CC module/event/subsystems/nbd/nbd.o 00:02:22.905 CC module/event/subsystems/ublk/ublk.o 00:02:22.905 CC module/event/subsystems/scsi/scsi.o 00:02:22.905 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:22.905 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:23.164 LIB libspdk_event_ublk.a 00:02:23.164 LIB libspdk_event_nbd.a 00:02:23.164 LIB libspdk_event_scsi.a 00:02:23.164 LIB libspdk_event_nvmf.a 00:02:23.424 CC module/event/subsystems/iscsi/iscsi.o 00:02:23.424 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:23.424 LIB libspdk_event_vhost_scsi.a 00:02:23.424 LIB libspdk_event_iscsi.a 00:02:23.993 CC app/trace_record/trace_record.o 00:02:23.993 CC app/spdk_nvme_perf/perf.o 00:02:23.993 TEST_HEADER include/spdk/accel_module.h 00:02:23.993 TEST_HEADER include/spdk/accel.h 00:02:23.993 TEST_HEADER include/spdk/assert.h 00:02:23.993 TEST_HEADER include/spdk/base64.h 00:02:23.993 TEST_HEADER include/spdk/bdev.h 00:02:23.993 TEST_HEADER include/spdk/barrier.h 00:02:23.993 TEST_HEADER include/spdk/bdev_module.h 00:02:23.993 TEST_HEADER include/spdk/bdev_zone.h 00:02:23.993 CXX app/trace/trace.o 00:02:23.993 TEST_HEADER include/spdk/bit_pool.h 00:02:23.993 TEST_HEADER include/spdk/blob_bdev.h 00:02:23.993 TEST_HEADER include/spdk/bit_array.h 00:02:23.993 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:23.993 TEST_HEADER include/spdk/blobfs.h 00:02:23.993 CC 
app/spdk_nvme_identify/identify.o 00:02:23.993 TEST_HEADER include/spdk/conf.h 00:02:23.993 TEST_HEADER include/spdk/blob.h 00:02:23.993 TEST_HEADER include/spdk/cpuset.h 00:02:23.993 TEST_HEADER include/spdk/config.h 00:02:23.993 TEST_HEADER include/spdk/crc16.h 00:02:23.993 TEST_HEADER include/spdk/crc32.h 00:02:23.993 TEST_HEADER include/spdk/dif.h 00:02:23.993 TEST_HEADER include/spdk/crc64.h 00:02:23.993 TEST_HEADER include/spdk/dma.h 00:02:23.993 TEST_HEADER include/spdk/endian.h 00:02:23.993 CC app/spdk_nvme_discover/discovery_aer.o 00:02:23.993 TEST_HEADER include/spdk/env.h 00:02:23.993 TEST_HEADER include/spdk/event.h 00:02:23.993 TEST_HEADER include/spdk/env_dpdk.h 00:02:23.993 TEST_HEADER include/spdk/fd_group.h 00:02:23.993 TEST_HEADER include/spdk/file.h 00:02:23.993 TEST_HEADER include/spdk/fd.h 00:02:23.993 TEST_HEADER include/spdk/gpt_spec.h 00:02:23.993 TEST_HEADER include/spdk/ftl.h 00:02:23.993 TEST_HEADER include/spdk/hexlify.h 00:02:23.993 TEST_HEADER include/spdk/histogram_data.h 00:02:23.993 CC app/spdk_top/spdk_top.o 00:02:23.993 TEST_HEADER include/spdk/idxd.h 00:02:23.994 CC app/spdk_lspci/spdk_lspci.o 00:02:23.994 CC test/rpc_client/rpc_client_test.o 00:02:23.994 TEST_HEADER include/spdk/init.h 00:02:23.994 TEST_HEADER include/spdk/ioat.h 00:02:23.994 TEST_HEADER include/spdk/idxd_spec.h 00:02:23.994 TEST_HEADER include/spdk/ioat_spec.h 00:02:23.994 TEST_HEADER include/spdk/iscsi_spec.h 00:02:23.994 TEST_HEADER include/spdk/json.h 00:02:23.994 TEST_HEADER include/spdk/jsonrpc.h 00:02:23.994 TEST_HEADER include/spdk/log.h 00:02:23.994 TEST_HEADER include/spdk/likely.h 00:02:23.994 TEST_HEADER include/spdk/lvol.h 00:02:23.994 TEST_HEADER include/spdk/mmio.h 00:02:23.994 TEST_HEADER include/spdk/memory.h 00:02:23.994 TEST_HEADER include/spdk/nbd.h 00:02:23.994 TEST_HEADER include/spdk/notify.h 00:02:23.994 TEST_HEADER include/spdk/nvme.h 00:02:23.994 TEST_HEADER include/spdk/nvme_intel.h 00:02:23.994 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:23.994 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:23.994 TEST_HEADER include/spdk/nvme_spec.h 00:02:23.994 CC app/iscsi_tgt/iscsi_tgt.o 00:02:23.994 TEST_HEADER include/spdk/nvme_zns.h 00:02:23.994 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:23.994 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:23.994 TEST_HEADER include/spdk/nvmf.h 00:02:23.994 TEST_HEADER include/spdk/nvmf_spec.h 00:02:23.994 TEST_HEADER include/spdk/nvmf_transport.h 00:02:23.994 TEST_HEADER include/spdk/opal.h 00:02:23.994 TEST_HEADER include/spdk/opal_spec.h 00:02:23.994 TEST_HEADER include/spdk/pci_ids.h 00:02:23.994 TEST_HEADER include/spdk/pipe.h 00:02:23.994 TEST_HEADER include/spdk/queue.h 00:02:23.994 TEST_HEADER include/spdk/reduce.h 00:02:23.994 TEST_HEADER include/spdk/rpc.h 00:02:23.994 TEST_HEADER include/spdk/scsi.h 00:02:23.994 TEST_HEADER include/spdk/scheduler.h 00:02:23.994 TEST_HEADER include/spdk/scsi_spec.h 00:02:23.994 TEST_HEADER include/spdk/sock.h 00:02:23.994 TEST_HEADER include/spdk/stdinc.h 00:02:23.994 TEST_HEADER include/spdk/thread.h 00:02:23.994 TEST_HEADER include/spdk/string.h 00:02:23.994 TEST_HEADER include/spdk/trace.h 00:02:23.994 TEST_HEADER include/spdk/trace_parser.h 00:02:23.994 TEST_HEADER include/spdk/tree.h 00:02:23.994 TEST_HEADER include/spdk/ublk.h 00:02:23.994 TEST_HEADER include/spdk/uuid.h 00:02:23.994 TEST_HEADER include/spdk/util.h 00:02:23.994 TEST_HEADER include/spdk/version.h 00:02:23.994 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:23.994 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:23.994 
TEST_HEADER include/spdk/vhost.h 00:02:23.994 TEST_HEADER include/spdk/vmd.h 00:02:23.994 TEST_HEADER include/spdk/xor.h 00:02:23.994 TEST_HEADER include/spdk/zipf.h 00:02:23.994 CXX test/cpp_headers/accel.o 00:02:23.994 CXX test/cpp_headers/accel_module.o 00:02:23.994 CXX test/cpp_headers/assert.o 00:02:23.994 CXX test/cpp_headers/base64.o 00:02:23.994 CXX test/cpp_headers/barrier.o 00:02:23.994 CC app/spdk_dd/spdk_dd.o 00:02:23.994 CXX test/cpp_headers/bdev.o 00:02:23.994 CXX test/cpp_headers/bdev_module.o 00:02:23.994 CXX test/cpp_headers/bit_array.o 00:02:23.994 CXX test/cpp_headers/bdev_zone.o 00:02:23.994 CXX test/cpp_headers/bit_pool.o 00:02:23.994 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:23.994 CXX test/cpp_headers/blob_bdev.o 00:02:23.994 CXX test/cpp_headers/blobfs_bdev.o 00:02:23.994 CXX test/cpp_headers/blobfs.o 00:02:23.994 CC app/spdk_tgt/spdk_tgt.o 00:02:23.994 CXX test/cpp_headers/blob.o 00:02:23.994 CXX test/cpp_headers/conf.o 00:02:23.994 CXX test/cpp_headers/config.o 00:02:23.994 CXX test/cpp_headers/cpuset.o 00:02:23.994 CXX test/cpp_headers/crc16.o 00:02:23.994 CXX test/cpp_headers/crc32.o 00:02:23.994 CXX test/cpp_headers/crc64.o 00:02:23.994 CXX test/cpp_headers/dif.o 00:02:23.994 CXX test/cpp_headers/dma.o 00:02:23.994 CC app/nvmf_tgt/nvmf_main.o 00:02:23.994 CXX test/cpp_headers/endian.o 00:02:23.994 CXX test/cpp_headers/env_dpdk.o 00:02:23.994 CXX test/cpp_headers/env.o 00:02:23.994 CXX test/cpp_headers/event.o 00:02:23.994 CXX test/cpp_headers/fd_group.o 00:02:23.994 CXX test/cpp_headers/fd.o 00:02:23.994 CXX test/cpp_headers/file.o 00:02:23.994 CC app/vhost/vhost.o 00:02:23.994 CXX test/cpp_headers/ftl.o 00:02:23.994 CXX test/cpp_headers/gpt_spec.o 00:02:23.994 CXX test/cpp_headers/hexlify.o 00:02:23.994 CXX test/cpp_headers/histogram_data.o 00:02:23.994 CXX test/cpp_headers/idxd.o 00:02:23.994 CXX test/cpp_headers/idxd_spec.o 00:02:23.994 CXX test/cpp_headers/init.o 00:02:23.994 CC examples/nvme/reconnect/reconnect.o 00:02:23.994 CC test/event/event_perf/event_perf.o 00:02:23.994 CC test/app/jsoncat/jsoncat.o 00:02:23.994 CXX test/cpp_headers/ioat.o 00:02:23.994 CC examples/nvme/arbitration/arbitration.o 00:02:23.994 CC test/event/reactor/reactor.o 00:02:23.994 CC test/event/reactor_perf/reactor_perf.o 00:02:23.994 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:23.994 CC test/app/stub/stub.o 00:02:23.994 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:23.994 CC examples/nvme/hello_world/hello_world.o 00:02:23.994 CC test/app/histogram_perf/histogram_perf.o 00:02:23.994 CC test/env/memory/memory_ut.o 00:02:23.994 CC examples/nvme/hotplug/hotplug.o 00:02:23.994 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:23.994 CC test/env/pci/pci_ut.o 00:02:23.994 CC examples/nvme/abort/abort.o 00:02:23.994 CC examples/vmd/lsvmd/lsvmd.o 00:02:23.994 CC test/env/vtophys/vtophys.o 00:02:23.994 CC examples/vmd/led/led.o 00:02:23.994 CC examples/ioat/perf/perf.o 00:02:23.994 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:23.994 CC examples/ioat/verify/verify.o 00:02:23.994 CC test/nvme/e2edp/nvme_dp.o 00:02:23.994 CC test/nvme/reset/reset.o 00:02:23.994 CC examples/accel/perf/accel_perf.o 00:02:23.994 CC test/nvme/aer/aer.o 00:02:23.994 CC test/nvme/sgl/sgl.o 00:02:23.994 CC test/thread/poller_perf/poller_perf.o 00:02:23.994 CC test/nvme/overhead/overhead.o 00:02:23.994 CC examples/idxd/perf/perf.o 00:02:23.994 CC examples/sock/hello_world/hello_sock.o 00:02:23.994 CC test/thread/lock/spdk_lock.o 00:02:23.994 CC test/nvme/err_injection/err_injection.o 
00:02:23.994 CC test/nvme/reserve/reserve.o 00:02:23.994 CC test/nvme/compliance/nvme_compliance.o 00:02:23.994 CC test/nvme/fdp/fdp.o 00:02:23.994 CC test/nvme/simple_copy/simple_copy.o 00:02:23.994 CC app/fio/nvme/fio_plugin.o 00:02:23.994 CC test/nvme/startup/startup.o 00:02:23.994 CC examples/util/zipf/zipf.o 00:02:23.994 CC test/nvme/boot_partition/boot_partition.o 00:02:23.994 CC test/nvme/connect_stress/connect_stress.o 00:02:23.994 CC test/nvme/fused_ordering/fused_ordering.o 00:02:23.994 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:23.994 CC test/nvme/cuse/cuse.o 00:02:23.994 CC test/event/scheduler/scheduler.o 00:02:23.994 CC test/bdev/bdevio/bdevio.o 00:02:23.994 CC test/dma/test_dma/test_dma.o 00:02:23.994 CC test/event/app_repeat/app_repeat.o 00:02:23.994 CC examples/blob/cli/blobcli.o 00:02:23.994 CC test/app/bdev_svc/bdev_svc.o 00:02:23.994 LINK spdk_lspci 00:02:23.994 CC test/accel/dif/dif.o 00:02:23.994 CC test/blobfs/mkfs/mkfs.o 00:02:23.994 CC examples/blob/hello_world/hello_blob.o 00:02:23.994 CC app/fio/bdev/fio_plugin.o 00:02:23.994 CC examples/thread/thread/thread_ex.o 00:02:23.994 CC examples/bdev/bdevperf/bdevperf.o 00:02:23.994 CC examples/nvmf/nvmf/nvmf.o 00:02:23.994 CC examples/bdev/hello_world/hello_bdev.o 00:02:23.994 CC test/env/mem_callbacks/mem_callbacks.o 00:02:23.994 CC test/lvol/esnap/esnap.o 00:02:23.994 LINK rpc_client_test 00:02:23.994 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:23.994 LINK spdk_nvme_discover 00:02:24.261 LINK spdk_trace_record 00:02:24.261 CXX test/cpp_headers/ioat_spec.o 00:02:24.261 CXX test/cpp_headers/iscsi_spec.o 00:02:24.261 CXX test/cpp_headers/json.o 00:02:24.261 CXX test/cpp_headers/jsonrpc.o 00:02:24.261 CXX test/cpp_headers/likely.o 00:02:24.261 CXX test/cpp_headers/log.o 00:02:24.261 CXX test/cpp_headers/lvol.o 00:02:24.261 LINK reactor 00:02:24.261 CXX test/cpp_headers/memory.o 00:02:24.261 CXX test/cpp_headers/mmio.o 00:02:24.261 CXX test/cpp_headers/nbd.o 00:02:24.261 CXX test/cpp_headers/notify.o 00:02:24.261 CXX test/cpp_headers/nvme.o 00:02:24.261 LINK jsoncat 00:02:24.261 CXX test/cpp_headers/nvme_intel.o 00:02:24.261 CXX test/cpp_headers/nvme_ocssd.o 00:02:24.261 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:24.261 CXX test/cpp_headers/nvme_spec.o 00:02:24.261 CXX test/cpp_headers/nvme_zns.o 00:02:24.261 CXX test/cpp_headers/nvmf_cmd.o 00:02:24.261 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:24.261 LINK interrupt_tgt 00:02:24.261 CXX test/cpp_headers/nvmf.o 00:02:24.261 CXX test/cpp_headers/nvmf_spec.o 00:02:24.261 CXX test/cpp_headers/nvmf_transport.o 00:02:24.261 LINK event_perf 00:02:24.261 CXX test/cpp_headers/opal.o 00:02:24.261 LINK histogram_perf 00:02:24.261 LINK lsvmd 00:02:24.261 CXX test/cpp_headers/opal_spec.o 00:02:24.261 LINK reactor_perf 00:02:24.261 CXX test/cpp_headers/pci_ids.o 00:02:24.261 CXX test/cpp_headers/pipe.o 00:02:24.261 LINK nvmf_tgt 00:02:24.261 CXX test/cpp_headers/queue.o 00:02:24.261 LINK iscsi_tgt 00:02:24.261 LINK vtophys 00:02:24.261 CXX test/cpp_headers/reduce.o 00:02:24.261 CXX test/cpp_headers/rpc.o 00:02:24.261 LINK env_dpdk_post_init 00:02:24.261 LINK vhost 00:02:24.261 CXX test/cpp_headers/scheduler.o 00:02:24.261 LINK led 00:02:24.261 LINK poller_perf 00:02:24.261 CXX test/cpp_headers/scsi.o 00:02:24.261 LINK zipf 00:02:24.261 CXX test/cpp_headers/scsi_spec.o 00:02:24.261 CXX test/cpp_headers/sock.o 00:02:24.261 LINK stub 00:02:24.261 LINK pmr_persistence 00:02:24.261 LINK spdk_tgt 00:02:24.261 LINK app_repeat 00:02:24.261 LINK connect_stress 00:02:24.261 LINK startup 
00:02:24.261 CXX test/cpp_headers/stdinc.o 00:02:24.261 LINK boot_partition 00:02:24.261 LINK doorbell_aers 00:02:24.261 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:24.261 LINK err_injection 00:02:24.261 LINK ioat_perf 00:02:24.261 LINK cmb_copy 00:02:24.261 LINK reserve 00:02:24.261 LINK fused_ordering 00:02:24.261 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:24.261 LINK hello_world 00:02:24.261 LINK verify 00:02:24.261 LINK hotplug 00:02:24.261 LINK bdev_svc 00:02:24.261 LINK hello_sock 00:02:24.261 LINK simple_copy 00:02:24.261 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:24.261 LINK mkfs 00:02:24.261 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:24.261 LINK scheduler 00:02:24.261 LINK reset 00:02:24.261 LINK nvme_dp 00:02:24.261 LINK fdp 00:02:24.261 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:24.261 LINK aer 00:02:24.261 LINK sgl 00:02:24.261 LINK hello_blob 00:02:24.261 CXX test/cpp_headers/string.o 00:02:24.261 LINK overhead 00:02:24.522 CXX test/cpp_headers/thread.o 00:02:24.522 LINK thread 00:02:24.522 CXX test/cpp_headers/trace.o 00:02:24.522 CXX test/cpp_headers/trace_parser.o 00:02:24.522 LINK hello_bdev 00:02:24.522 CXX test/cpp_headers/tree.o 00:02:24.522 LINK spdk_trace 00:02:24.522 CXX test/cpp_headers/ublk.o 00:02:24.522 CXX test/cpp_headers/util.o 00:02:24.522 CXX test/cpp_headers/uuid.o 00:02:24.522 CXX test/cpp_headers/version.o 00:02:24.522 CXX test/cpp_headers/vfio_user_pci.o 00:02:24.522 CXX test/cpp_headers/vfio_user_spec.o 00:02:24.522 LINK reconnect 00:02:24.522 CXX test/cpp_headers/vhost.o 00:02:24.522 CXX test/cpp_headers/vmd.o 00:02:24.522 CXX test/cpp_headers/xor.o 00:02:24.522 CXX test/cpp_headers/zipf.o 00:02:24.522 LINK arbitration 00:02:24.522 LINK nvmf 00:02:24.522 LINK abort 00:02:24.522 LINK test_dma 00:02:24.522 LINK spdk_dd 00:02:24.522 LINK nvme_compliance 00:02:24.522 LINK bdevio 00:02:24.522 LINK idxd_perf 00:02:24.522 LINK nvme_manage 00:02:24.522 LINK dif 00:02:24.522 LINK accel_perf 00:02:24.522 LINK pci_ut 00:02:24.781 LINK nvme_fuzz 00:02:24.781 LINK llvm_vfio_fuzz 00:02:24.781 LINK blobcli 00:02:24.781 LINK spdk_nvme_perf 00:02:24.781 LINK mem_callbacks 00:02:24.781 LINK spdk_nvme 00:02:24.781 LINK spdk_bdev 00:02:25.039 LINK vhost_fuzz 00:02:25.039 LINK spdk_nvme_identify 00:02:25.039 LINK spdk_top 00:02:25.039 LINK cuse 00:02:25.039 LINK bdevperf 00:02:25.039 LINK memory_ut 00:02:25.039 LINK llvm_nvme_fuzz 00:02:25.606 LINK spdk_lock 00:02:25.606 LINK iscsi_fuzz 00:02:27.511 LINK esnap 00:02:27.770 00:02:27.770 real 0m41.769s 00:02:27.770 user 5m43.704s 00:02:27.770 sys 2m50.552s 00:02:27.770 06:11:57 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:02:27.770 06:11:57 -- common/autotest_common.sh@10 -- $ set +x 00:02:27.770 ************************************ 00:02:27.770 END TEST make 00:02:27.770 ************************************ 00:02:28.030 06:11:57 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:02:28.030 06:11:57 -- common/autotest_common.sh@1690 -- # lcov --version 00:02:28.030 06:11:57 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:02:28.030 06:11:57 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:02:28.030 06:11:57 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:02:28.030 06:11:57 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:02:28.030 06:11:57 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:02:28.030 06:11:57 -- scripts/common.sh@335 -- # IFS=.-: 00:02:28.030 06:11:57 -- scripts/common.sh@335 -- # read -ra ver1 00:02:28.030 06:11:57 -- 
scripts/common.sh@336 -- # IFS=.-: 00:02:28.030 06:11:57 -- scripts/common.sh@336 -- # read -ra ver2 00:02:28.030 06:11:57 -- scripts/common.sh@337 -- # local 'op=<' 00:02:28.030 06:11:57 -- scripts/common.sh@339 -- # ver1_l=2 00:02:28.030 06:11:57 -- scripts/common.sh@340 -- # ver2_l=1 00:02:28.030 06:11:57 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:02:28.030 06:11:57 -- scripts/common.sh@343 -- # case "$op" in 00:02:28.030 06:11:57 -- scripts/common.sh@344 -- # : 1 00:02:28.030 06:11:57 -- scripts/common.sh@363 -- # (( v = 0 )) 00:02:28.030 06:11:57 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:28.030 06:11:57 -- scripts/common.sh@364 -- # decimal 1 00:02:28.030 06:11:57 -- scripts/common.sh@352 -- # local d=1 00:02:28.030 06:11:57 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:28.030 06:11:57 -- scripts/common.sh@354 -- # echo 1 00:02:28.030 06:11:57 -- scripts/common.sh@364 -- # ver1[v]=1 00:02:28.030 06:11:57 -- scripts/common.sh@365 -- # decimal 2 00:02:28.030 06:11:57 -- scripts/common.sh@352 -- # local d=2 00:02:28.030 06:11:57 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:28.030 06:11:57 -- scripts/common.sh@354 -- # echo 2 00:02:28.030 06:11:57 -- scripts/common.sh@365 -- # ver2[v]=2 00:02:28.030 06:11:57 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:02:28.030 06:11:57 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:02:28.030 06:11:57 -- scripts/common.sh@367 -- # return 0 00:02:28.031 06:11:57 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:28.031 06:11:57 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:02:28.031 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:28.031 --rc genhtml_branch_coverage=1 00:02:28.031 --rc genhtml_function_coverage=1 00:02:28.031 --rc genhtml_legend=1 00:02:28.031 --rc geninfo_all_blocks=1 00:02:28.031 --rc geninfo_unexecuted_blocks=1 00:02:28.031 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:28.031 ' 00:02:28.031 06:11:57 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:02:28.031 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:28.031 --rc genhtml_branch_coverage=1 00:02:28.031 --rc genhtml_function_coverage=1 00:02:28.031 --rc genhtml_legend=1 00:02:28.031 --rc geninfo_all_blocks=1 00:02:28.031 --rc geninfo_unexecuted_blocks=1 00:02:28.031 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:28.031 ' 00:02:28.031 06:11:57 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:02:28.031 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:28.031 --rc genhtml_branch_coverage=1 00:02:28.031 --rc genhtml_function_coverage=1 00:02:28.031 --rc genhtml_legend=1 00:02:28.031 --rc geninfo_all_blocks=1 00:02:28.031 --rc geninfo_unexecuted_blocks=1 00:02:28.031 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:28.031 ' 00:02:28.031 06:11:57 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:02:28.031 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:28.031 --rc genhtml_branch_coverage=1 00:02:28.031 --rc genhtml_function_coverage=1 00:02:28.031 --rc genhtml_legend=1 00:02:28.031 --rc geninfo_all_blocks=1 00:02:28.031 --rc geninfo_unexecuted_blocks=1 00:02:28.031 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:28.031 ' 00:02:28.031 
06:11:57 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:02:28.031 06:11:57 -- nvmf/common.sh@7 -- # uname -s 00:02:28.031 06:11:57 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:28.031 06:11:57 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:28.031 06:11:57 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:28.031 06:11:57 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:28.031 06:11:57 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:28.031 06:11:57 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:28.031 06:11:57 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:28.031 06:11:57 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:28.031 06:11:57 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:28.031 06:11:57 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:28.031 06:11:57 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:02:28.031 06:11:57 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:02:28.031 06:11:57 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:28.031 06:11:57 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:28.031 06:11:57 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:28.031 06:11:57 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:28.031 06:11:57 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:28.031 06:11:57 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:28.031 06:11:57 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:28.031 06:11:57 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:28.031 06:11:57 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:28.031 06:11:57 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:28.031 06:11:57 -- paths/export.sh@5 -- # export PATH 00:02:28.031 06:11:57 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:28.031 06:11:57 -- nvmf/common.sh@46 -- # : 0 00:02:28.031 06:11:57 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:02:28.031 06:11:57 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:02:28.031 06:11:57 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:02:28.031 06:11:57 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:28.031 06:11:57 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:28.031 06:11:57 -- nvmf/common.sh@32 -- # 
'[' -n '' ']' 00:02:28.031 06:11:57 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:02:28.031 06:11:57 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:02:28.031 06:11:57 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:28.031 06:11:57 -- spdk/autotest.sh@32 -- # uname -s 00:02:28.031 06:11:57 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:28.031 06:11:57 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:28.031 06:11:57 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:28.031 06:11:57 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:28.031 06:11:57 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:28.031 06:11:57 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:28.031 06:11:57 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:28.031 06:11:57 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:28.031 06:11:57 -- spdk/autotest.sh@48 -- # udevadm_pid=4148275 00:02:28.031 06:11:57 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:28.031 06:11:57 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:28.031 06:11:57 -- spdk/autotest.sh@54 -- # echo 4148277 00:02:28.031 06:11:57 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:28.031 06:11:57 -- spdk/autotest.sh@56 -- # echo 4148278 00:02:28.031 06:11:57 -- spdk/autotest.sh@58 -- # [[ ............................... != QEMU ]] 00:02:28.031 06:11:57 -- spdk/autotest.sh@60 -- # echo 4148279 00:02:28.031 06:11:57 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:28.031 06:11:57 -- spdk/autotest.sh@62 -- # echo 4148280 00:02:28.031 06:11:57 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:28.031 06:11:57 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:28.031 06:11:57 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:02:28.031 06:11:57 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:28.031 06:11:57 -- common/autotest_common.sh@10 -- # set +x 00:02:28.031 06:11:57 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:28.031 06:11:57 -- spdk/autotest.sh@70 -- # create_test_list 00:02:28.031 06:11:57 -- common/autotest_common.sh@746 -- # xtrace_disable 00:02:28.031 06:11:57 -- common/autotest_common.sh@10 -- # set +x 00:02:28.031 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:02:28.031 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:02:28.031 06:11:57 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:02:28.031 06:11:57 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:28.031 
06:11:57 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:28.031 06:11:57 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:28.031 06:11:57 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:28.031 06:11:57 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:02:28.031 06:11:57 -- common/autotest_common.sh@1450 -- # uname 00:02:28.031 06:11:57 -- common/autotest_common.sh@1450 -- # '[' Linux = FreeBSD ']' 00:02:28.031 06:11:57 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:02:28.031 06:11:57 -- common/autotest_common.sh@1470 -- # uname 00:02:28.291 06:11:57 -- common/autotest_common.sh@1470 -- # [[ Linux = FreeBSD ]] 00:02:28.291 06:11:57 -- spdk/autotest.sh@79 -- # [[ y == y ]] 00:02:28.291 06:11:57 -- spdk/autotest.sh@81 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:02:28.291 lcov: LCOV version 1.15 00:02:28.291 06:11:57 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:02:30.198 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:02:30.198 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:02:30.198 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:02:42.417 06:12:11 -- spdk/autotest.sh@87 -- # timing_enter pre_cleanup 00:02:42.417 06:12:11 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:42.417 06:12:11 -- common/autotest_common.sh@10 -- # set +x 00:02:42.417 06:12:11 -- spdk/autotest.sh@89 -- # rm -f 00:02:42.417 06:12:11 -- spdk/autotest.sh@92 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:45.709 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:02:45.709 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:02:45.709 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:02:45.709 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:02:45.709 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:02:45.709 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:02:45.709 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:02:45.709 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:02:45.709 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:02:45.709 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:02:45.709 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:02:45.709 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:02:45.709 0000:80:04.3 (8086 2021): Already using the ioatdma driver 
00:02:45.709 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:02:45.968 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:02:45.968 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:02:45.968 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:02:45.968 06:12:15 -- spdk/autotest.sh@94 -- # get_zoned_devs 00:02:45.968 06:12:15 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:02:45.968 06:12:15 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:02:45.968 06:12:15 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:02:45.968 06:12:15 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:02:45.969 06:12:15 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:02:45.969 06:12:15 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:02:45.969 06:12:15 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:45.969 06:12:15 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:02:45.969 06:12:15 -- spdk/autotest.sh@96 -- # (( 0 > 0 )) 00:02:45.969 06:12:15 -- spdk/autotest.sh@108 -- # ls /dev/nvme0n1 00:02:45.969 06:12:15 -- spdk/autotest.sh@108 -- # grep -v p 00:02:45.969 06:12:15 -- spdk/autotest.sh@108 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:02:45.969 06:12:15 -- spdk/autotest.sh@110 -- # [[ -z '' ]] 00:02:45.969 06:12:15 -- spdk/autotest.sh@111 -- # block_in_use /dev/nvme0n1 00:02:45.969 06:12:15 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:02:45.969 06:12:15 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:02:45.969 No valid GPT data, bailing 00:02:45.969 06:12:15 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:02:45.969 06:12:15 -- scripts/common.sh@393 -- # pt= 00:02:45.969 06:12:15 -- scripts/common.sh@394 -- # return 1 00:02:45.969 06:12:15 -- spdk/autotest.sh@112 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:02:45.969 1+0 records in 00:02:45.969 1+0 records out 00:02:45.969 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00562564 s, 186 MB/s 00:02:45.969 06:12:15 -- spdk/autotest.sh@116 -- # sync 00:02:45.969 06:12:15 -- spdk/autotest.sh@118 -- # xtrace_disable_per_cmd reap_spdk_processes 00:02:45.969 06:12:15 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:02:45.969 06:12:15 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:02:52.537 06:12:21 -- spdk/autotest.sh@122 -- # uname -s 00:02:52.537 06:12:21 -- spdk/autotest.sh@122 -- # '[' Linux = Linux ']' 00:02:52.537 06:12:21 -- spdk/autotest.sh@123 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:52.537 06:12:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:52.537 06:12:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:52.537 06:12:21 -- common/autotest_common.sh@10 -- # set +x 00:02:52.537 ************************************ 00:02:52.537 START TEST setup.sh 00:02:52.537 ************************************ 00:02:52.537 06:12:21 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:52.537 * Looking for test storage... 
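[annotation] get_zoned_devs above decides whether each NVMe namespace is zoned by reading /sys/block/<dev>/queue/zoned; anything other than "none" would exclude the device from the GPT probe and the dd scrub that follow. A standalone sketch of that scan against the same sysfs attribute:

    declare -A zoned_devs=()
    for path in /sys/block/nvme*n*; do
        [[ -e $path/queue/zoned ]] || continue      # attribute absent on old kernels
        dev=${path##*/}
        [[ $(<"$path/queue/zoned") == none ]] && continue
        zoned_devs[$dev]=1                          # exclude from GPT/dd checks
    done
    echo "zoned devices: ${!zoned_devs[*]}"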
00:02:52.537 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:52.537 06:12:21 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:02:52.537 06:12:21 -- common/autotest_common.sh@1690 -- # lcov --version 00:02:52.537 06:12:21 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:02:52.537 06:12:21 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:02:52.537 06:12:21 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:02:52.537 06:12:21 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:02:52.537 06:12:21 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:02:52.537 06:12:21 -- scripts/common.sh@335 -- # IFS=.-: 00:02:52.537 06:12:21 -- scripts/common.sh@335 -- # read -ra ver1 00:02:52.537 06:12:21 -- scripts/common.sh@336 -- # IFS=.-: 00:02:52.537 06:12:21 -- scripts/common.sh@336 -- # read -ra ver2 00:02:52.537 06:12:21 -- scripts/common.sh@337 -- # local 'op=<' 00:02:52.537 06:12:21 -- scripts/common.sh@339 -- # ver1_l=2 00:02:52.537 06:12:21 -- scripts/common.sh@340 -- # ver2_l=1 00:02:52.537 06:12:21 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:02:52.537 06:12:21 -- scripts/common.sh@343 -- # case "$op" in 00:02:52.537 06:12:21 -- scripts/common.sh@344 -- # : 1 00:02:52.537 06:12:21 -- scripts/common.sh@363 -- # (( v = 0 )) 00:02:52.537 06:12:21 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:52.537 06:12:21 -- scripts/common.sh@364 -- # decimal 1 00:02:52.537 06:12:21 -- scripts/common.sh@352 -- # local d=1 00:02:52.537 06:12:21 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:52.537 06:12:21 -- scripts/common.sh@354 -- # echo 1 00:02:52.537 06:12:21 -- scripts/common.sh@364 -- # ver1[v]=1 00:02:52.537 06:12:21 -- scripts/common.sh@365 -- # decimal 2 00:02:52.537 06:12:21 -- scripts/common.sh@352 -- # local d=2 00:02:52.537 06:12:21 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:52.537 06:12:21 -- scripts/common.sh@354 -- # echo 2 00:02:52.537 06:12:21 -- scripts/common.sh@365 -- # ver2[v]=2 00:02:52.537 06:12:21 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:02:52.537 06:12:21 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:02:52.537 06:12:21 -- scripts/common.sh@367 -- # return 0 00:02:52.537 06:12:21 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:52.537 06:12:21 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:02:52.537 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:52.537 --rc genhtml_branch_coverage=1 00:02:52.537 --rc genhtml_function_coverage=1 00:02:52.537 --rc genhtml_legend=1 00:02:52.538 --rc geninfo_all_blocks=1 00:02:52.538 --rc geninfo_unexecuted_blocks=1 00:02:52.538 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:52.538 ' 00:02:52.538 06:12:21 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:02:52.538 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:52.538 --rc genhtml_branch_coverage=1 00:02:52.538 --rc genhtml_function_coverage=1 00:02:52.538 --rc genhtml_legend=1 00:02:52.538 --rc geninfo_all_blocks=1 00:02:52.538 --rc geninfo_unexecuted_blocks=1 00:02:52.538 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:52.538 ' 00:02:52.538 06:12:21 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:02:52.538 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:52.538 --rc genhtml_branch_coverage=1 
00:02:52.538 --rc genhtml_function_coverage=1 00:02:52.538 --rc genhtml_legend=1 00:02:52.538 --rc geninfo_all_blocks=1 00:02:52.538 --rc geninfo_unexecuted_blocks=1 00:02:52.538 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:52.538 ' 00:02:52.538 06:12:21 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:02:52.538 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:52.538 --rc genhtml_branch_coverage=1 00:02:52.538 --rc genhtml_function_coverage=1 00:02:52.538 --rc genhtml_legend=1 00:02:52.538 --rc geninfo_all_blocks=1 00:02:52.538 --rc geninfo_unexecuted_blocks=1 00:02:52.538 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:52.538 ' 00:02:52.538 06:12:21 -- setup/test-setup.sh@10 -- # uname -s 00:02:52.538 06:12:21 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:02:52.538 06:12:21 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:52.538 06:12:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:52.538 06:12:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:52.538 06:12:21 -- common/autotest_common.sh@10 -- # set +x 00:02:52.538 ************************************ 00:02:52.538 START TEST acl 00:02:52.538 ************************************ 00:02:52.538 06:12:21 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:52.538 * Looking for test storage... 00:02:52.538 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:52.538 06:12:21 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:02:52.538 06:12:21 -- common/autotest_common.sh@1690 -- # lcov --version 00:02:52.538 06:12:21 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:02:52.538 06:12:22 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:02:52.538 06:12:22 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:02:52.538 06:12:22 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:02:52.538 06:12:22 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:02:52.538 06:12:22 -- scripts/common.sh@335 -- # IFS=.-: 00:02:52.538 06:12:22 -- scripts/common.sh@335 -- # read -ra ver1 00:02:52.538 06:12:22 -- scripts/common.sh@336 -- # IFS=.-: 00:02:52.538 06:12:22 -- scripts/common.sh@336 -- # read -ra ver2 00:02:52.538 06:12:22 -- scripts/common.sh@337 -- # local 'op=<' 00:02:52.538 06:12:22 -- scripts/common.sh@339 -- # ver1_l=2 00:02:52.538 06:12:22 -- scripts/common.sh@340 -- # ver2_l=1 00:02:52.538 06:12:22 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:02:52.538 06:12:22 -- scripts/common.sh@343 -- # case "$op" in 00:02:52.538 06:12:22 -- scripts/common.sh@344 -- # : 1 00:02:52.538 06:12:22 -- scripts/common.sh@363 -- # (( v = 0 )) 00:02:52.538 06:12:22 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:52.538 06:12:22 -- scripts/common.sh@364 -- # decimal 1 00:02:52.538 06:12:22 -- scripts/common.sh@352 -- # local d=1 00:02:52.538 06:12:22 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:52.538 06:12:22 -- scripts/common.sh@354 -- # echo 1 00:02:52.538 06:12:22 -- scripts/common.sh@364 -- # ver1[v]=1 00:02:52.538 06:12:22 -- scripts/common.sh@365 -- # decimal 2 00:02:52.538 06:12:22 -- scripts/common.sh@352 -- # local d=2 00:02:52.538 06:12:22 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:52.538 06:12:22 -- scripts/common.sh@354 -- # echo 2 00:02:52.538 06:12:22 -- scripts/common.sh@365 -- # ver2[v]=2 00:02:52.538 06:12:22 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:02:52.538 06:12:22 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:02:52.538 06:12:22 -- scripts/common.sh@367 -- # return 0 00:02:52.538 06:12:22 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:52.538 06:12:22 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:02:52.538 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:52.538 --rc genhtml_branch_coverage=1 00:02:52.538 --rc genhtml_function_coverage=1 00:02:52.538 --rc genhtml_legend=1 00:02:52.538 --rc geninfo_all_blocks=1 00:02:52.538 --rc geninfo_unexecuted_blocks=1 00:02:52.538 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:52.538 ' 00:02:52.538 06:12:22 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:02:52.538 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:52.538 --rc genhtml_branch_coverage=1 00:02:52.538 --rc genhtml_function_coverage=1 00:02:52.538 --rc genhtml_legend=1 00:02:52.538 --rc geninfo_all_blocks=1 00:02:52.538 --rc geninfo_unexecuted_blocks=1 00:02:52.538 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:52.538 ' 00:02:52.538 06:12:22 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:02:52.538 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:52.538 --rc genhtml_branch_coverage=1 00:02:52.538 --rc genhtml_function_coverage=1 00:02:52.538 --rc genhtml_legend=1 00:02:52.538 --rc geninfo_all_blocks=1 00:02:52.538 --rc geninfo_unexecuted_blocks=1 00:02:52.538 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:52.538 ' 00:02:52.538 06:12:22 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:02:52.538 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:52.538 --rc genhtml_branch_coverage=1 00:02:52.538 --rc genhtml_function_coverage=1 00:02:52.538 --rc genhtml_legend=1 00:02:52.538 --rc geninfo_all_blocks=1 00:02:52.538 --rc geninfo_unexecuted_blocks=1 00:02:52.538 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:52.538 ' 00:02:52.538 06:12:22 -- setup/acl.sh@10 -- # get_zoned_devs 00:02:52.538 06:12:22 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:02:52.538 06:12:22 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:02:52.538 06:12:22 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:02:52.538 06:12:22 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:02:52.538 06:12:22 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:02:52.538 06:12:22 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:02:52.538 06:12:22 -- common/autotest_common.sh@1659 -- # [[ -e 
/sys/block/nvme0n1/queue/zoned ]] 00:02:52.538 06:12:22 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:02:52.538 06:12:22 -- setup/acl.sh@12 -- # devs=() 00:02:52.538 06:12:22 -- setup/acl.sh@12 -- # declare -a devs 00:02:52.538 06:12:22 -- setup/acl.sh@13 -- # drivers=() 00:02:52.538 06:12:22 -- setup/acl.sh@13 -- # declare -A drivers 00:02:52.538 06:12:22 -- setup/acl.sh@51 -- # setup reset 00:02:52.538 06:12:22 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:52.538 06:12:22 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:56.737 06:12:25 -- setup/acl.sh@52 -- # collect_setup_devs 00:02:56.737 06:12:25 -- setup/acl.sh@16 -- # local dev driver 00:02:56.737 06:12:25 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:56.737 06:12:25 -- setup/acl.sh@15 -- # setup output status 00:02:56.737 06:12:25 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:56.737 06:12:25 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:02:59.277 Hugepages 00:02:59.277 node hugesize free / total 00:02:59.277 06:12:28 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:59.277 06:12:28 -- setup/acl.sh@19 -- # continue 00:02:59.278 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.278 06:12:28 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:59.278 06:12:28 -- setup/acl.sh@19 -- # continue 00:02:59.278 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.278 06:12:28 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:59.278 06:12:28 -- setup/acl.sh@19 -- # continue 00:02:59.278 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.278 00:02:59.278 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:59.278 06:12:28 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:59.278 06:12:28 -- setup/acl.sh@19 -- # continue 00:02:59.278 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.537 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.537 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.537 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.537 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.537 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.537 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.537 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.537 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.537 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.537 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.537 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.537 
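[annotation] The read loop traced here consumes the `setup.sh status` table (Type BDF Vendor Device NUMA Driver ...), keeping only rows whose BDF column matches *:*:*.* and whose driver column is nvme; the ioatdma rows all fall through the continue. A compact sketch of the same column parse, assuming that layout:

    devs=()
    declare -A drivers=()
    while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue   # rows without a PCI BDF are headers
        [[ $driver == nvme ]] || continue   # skip ioatdma and friends
        devs+=("$dev")
        drivers[$dev]=$driver
    done < <(./scripts/setup.sh status)     # script path assumed
    printf 'collected %d nvme controller(s)\n' "${#devs[@]}"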
06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.537 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.537 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.537 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.537 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.537 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.537 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.537 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.537 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.537 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.537 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.537 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.537 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.537 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:02:59.538 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.538 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.538 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.538 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:02:59.538 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.538 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.538 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.538 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:02:59.538 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.538 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.538 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.538 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:02:59.538 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.538 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.538 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.538 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:02:59.538 06:12:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:59.538 06:12:28 -- setup/acl.sh@20 -- # continue 00:02:59.538 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.538 06:12:28 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:02:59.538 06:12:28 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:02:59.538 06:12:28 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:02:59.538 06:12:28 -- setup/acl.sh@22 -- # devs+=("$dev") 00:02:59.538 06:12:28 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:02:59.538 06:12:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:59.538 06:12:29 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:02:59.538 06:12:29 -- setup/acl.sh@54 -- # run_test denied denied 00:02:59.538 06:12:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:59.538 06:12:29 -- common/autotest_common.sh@1093 -- # 
xtrace_disable 00:02:59.538 06:12:29 -- common/autotest_common.sh@10 -- # set +x 00:02:59.538 ************************************ 00:02:59.538 START TEST denied 00:02:59.538 ************************************ 00:02:59.538 06:12:29 -- common/autotest_common.sh@1114 -- # denied 00:02:59.538 06:12:29 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:02:59.538 06:12:29 -- setup/acl.sh@38 -- # setup output config 00:02:59.538 06:12:29 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:02:59.538 06:12:29 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:59.538 06:12:29 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:03.733 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:03:03.733 06:12:32 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:03:03.733 06:12:32 -- setup/acl.sh@28 -- # local dev driver 00:03:03.733 06:12:32 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:03.733 06:12:32 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:03:03.733 06:12:32 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:03:03.733 06:12:32 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:03.733 06:12:32 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:03.733 06:12:32 -- setup/acl.sh@41 -- # setup reset 00:03:03.733 06:12:32 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:03.733 06:12:32 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:07.929 00:03:07.929 real 0m8.037s 00:03:07.929 user 0m2.562s 00:03:07.929 sys 0m4.834s 00:03:07.929 06:12:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:07.929 06:12:37 -- common/autotest_common.sh@10 -- # set +x 00:03:07.929 ************************************ 00:03:07.929 END TEST denied 00:03:07.929 ************************************ 00:03:07.929 06:12:37 -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:07.929 06:12:37 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:07.929 06:12:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:07.929 06:12:37 -- common/autotest_common.sh@10 -- # set +x 00:03:07.929 ************************************ 00:03:07.929 START TEST allowed 00:03:07.929 ************************************ 00:03:07.929 06:12:37 -- common/autotest_common.sh@1114 -- # allowed 00:03:07.929 06:12:37 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:03:07.929 06:12:37 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:03:07.929 06:12:37 -- setup/acl.sh@45 -- # setup output config 00:03:07.929 06:12:37 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:07.929 06:12:37 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:13.220 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:13.221 06:12:41 -- setup/acl.sh@47 -- # verify 00:03:13.221 06:12:41 -- setup/acl.sh@28 -- # local dev driver 00:03:13.221 06:12:41 -- setup/acl.sh@48 -- # setup reset 00:03:13.221 06:12:41 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:13.221 06:12:41 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:16.516 00:03:16.516 real 0m8.819s 00:03:16.516 user 0m2.523s 00:03:16.516 sys 0m4.915s 00:03:16.516 06:12:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:16.516 06:12:45 -- common/autotest_common.sh@10 -- # set +x 00:03:16.516 ************************************ 
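[annotation] The denied/allowed pair above exercises setup.sh's environment knobs: with PCI_BLOCKED listing a BDF, `setup.sh config` prints "Skipping denied controller at ..." and leaves the device on its kernel driver, while PCI_ALLOWED naming the same BDF rebinds only that controller (nvme -> vfio-pci in the trace). Sketched usage, with the BDF taken from the log and the script path assumed:

    SETUP=./scripts/setup.sh

    PCI_BLOCKED=' 0000:d8:00.0' "$SETUP" config \
        | grep 'Skipping denied controller at 0000:d8:00.0'

    "$SETUP" reset                               # back to kernel drivers

    PCI_ALLOWED='0000:d8:00.0' "$SETUP" config   # only this BDF moves to vfio-pci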
00:03:16.516 END TEST allowed 00:03:16.516 ************************************ 00:03:16.516 00:03:16.516 real 0m24.080s 00:03:16.516 user 0m7.687s 00:03:16.516 sys 0m14.650s 00:03:16.516 06:12:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:16.516 06:12:45 -- common/autotest_common.sh@10 -- # set +x 00:03:16.516 ************************************ 00:03:16.516 END TEST acl 00:03:16.516 ************************************ 00:03:16.516 06:12:45 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:16.516 06:12:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:16.516 06:12:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:16.516 06:12:45 -- common/autotest_common.sh@10 -- # set +x 00:03:16.516 ************************************ 00:03:16.516 START TEST hugepages 00:03:16.516 ************************************ 00:03:16.516 06:12:45 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:16.778 * Looking for test storage... 00:03:16.778 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:16.778 06:12:46 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:16.778 06:12:46 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:16.778 06:12:46 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:16.778 06:12:46 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:16.778 06:12:46 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:16.778 06:12:46 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:16.778 06:12:46 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:16.778 06:12:46 -- scripts/common.sh@335 -- # IFS=.-: 00:03:16.778 06:12:46 -- scripts/common.sh@335 -- # read -ra ver1 00:03:16.778 06:12:46 -- scripts/common.sh@336 -- # IFS=.-: 00:03:16.778 06:12:46 -- scripts/common.sh@336 -- # read -ra ver2 00:03:16.778 06:12:46 -- scripts/common.sh@337 -- # local 'op=<' 00:03:16.778 06:12:46 -- scripts/common.sh@339 -- # ver1_l=2 00:03:16.778 06:12:46 -- scripts/common.sh@340 -- # ver2_l=1 00:03:16.778 06:12:46 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:16.778 06:12:46 -- scripts/common.sh@343 -- # case "$op" in 00:03:16.778 06:12:46 -- scripts/common.sh@344 -- # : 1 00:03:16.778 06:12:46 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:16.778 06:12:46 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:16.778 06:12:46 -- scripts/common.sh@364 -- # decimal 1 00:03:16.778 06:12:46 -- scripts/common.sh@352 -- # local d=1 00:03:16.778 06:12:46 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:16.778 06:12:46 -- scripts/common.sh@354 -- # echo 1 00:03:16.778 06:12:46 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:16.778 06:12:46 -- scripts/common.sh@365 -- # decimal 2 00:03:16.778 06:12:46 -- scripts/common.sh@352 -- # local d=2 00:03:16.778 06:12:46 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:16.778 06:12:46 -- scripts/common.sh@354 -- # echo 2 00:03:16.778 06:12:46 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:16.778 06:12:46 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:16.778 06:12:46 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:16.778 06:12:46 -- scripts/common.sh@367 -- # return 0 00:03:16.778 06:12:46 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:16.778 06:12:46 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:16.778 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:16.778 --rc genhtml_branch_coverage=1 00:03:16.778 --rc genhtml_function_coverage=1 00:03:16.778 --rc genhtml_legend=1 00:03:16.778 --rc geninfo_all_blocks=1 00:03:16.778 --rc geninfo_unexecuted_blocks=1 00:03:16.778 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:16.778 ' 00:03:16.778 06:12:46 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:16.778 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:16.778 --rc genhtml_branch_coverage=1 00:03:16.778 --rc genhtml_function_coverage=1 00:03:16.778 --rc genhtml_legend=1 00:03:16.778 --rc geninfo_all_blocks=1 00:03:16.778 --rc geninfo_unexecuted_blocks=1 00:03:16.778 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:16.778 ' 00:03:16.778 06:12:46 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:16.778 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:16.778 --rc genhtml_branch_coverage=1 00:03:16.778 --rc genhtml_function_coverage=1 00:03:16.778 --rc genhtml_legend=1 00:03:16.778 --rc geninfo_all_blocks=1 00:03:16.778 --rc geninfo_unexecuted_blocks=1 00:03:16.778 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:16.778 ' 00:03:16.778 06:12:46 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:16.778 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:16.778 --rc genhtml_branch_coverage=1 00:03:16.778 --rc genhtml_function_coverage=1 00:03:16.778 --rc genhtml_legend=1 00:03:16.778 --rc geninfo_all_blocks=1 00:03:16.778 --rc geninfo_unexecuted_blocks=1 00:03:16.778 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:16.778 ' 00:03:16.778 06:12:46 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:16.778 06:12:46 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:16.778 06:12:46 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:16.778 06:12:46 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:16.778 06:12:46 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:16.778 06:12:46 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:16.778 06:12:46 -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:16.778 06:12:46 -- setup/common.sh@18 -- # local node= 00:03:16.778 06:12:46 -- setup/common.sh@19 -- # local var val 
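[annotation] The `lt 1.15 2` block that recurs before every LCOV_OPTS export splits each version string on IFS=.-: and compares it component by component, padding the shorter array with zeros. A standalone sketch of that comparison:

    ver_lt() {                        # ver_lt 1.15 2  ->  exit 0 (true)
        local -a a b
        IFS=.-: read -ra a <<< "$1"
        IFS=.-: read -ra b <<< "$2"
        local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
        for (( i = 0; i < n; i++ )); do
            if (( ${a[i]:-0} < ${b[i]:-0} )); then return 0; fi
            if (( ${a[i]:-0} > ${b[i]:-0} )); then return 1; fi
        done
        return 1                      # equal is not less-than
    }
    ver_lt 1.15 2 && echo 'lcov < 2: pass explicit branch/function flags'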
00:03:16.778 06:12:46 -- setup/common.sh@20 -- # local mem_f mem 00:03:16.778 06:12:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:16.778 06:12:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:16.778 06:12:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:16.778 06:12:46 -- setup/common.sh@28 -- # mapfile -t mem 00:03:16.778 06:12:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:16.778 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.778 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41467312 kB' 'MemAvailable: 43048296 kB' 'Buffers: 4384 kB' 'Cached: 9640500 kB' 'SwapCached: 76 kB' 'Active: 6675008 kB' 'Inactive: 3555396 kB' 'Active(anon): 5762508 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 588948 kB' 'Mapped: 178644 kB' 'Shmem: 7893352 kB' 'KReclaimable: 567212 kB' 'Slab: 1559892 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 992680 kB' 'KernelStack: 21920 kB' 'PageTables: 8936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433348 kB' 'Committed_AS: 10053604 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217924 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 
00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.779 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.779 06:12:46 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- 
# IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- 
setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # continue 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.780 06:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.780 06:12:46 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:16.780 06:12:46 -- setup/common.sh@33 -- # echo 2048 00:03:16.780 06:12:46 -- setup/common.sh@33 -- # return 0 00:03:16.780 06:12:46 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:16.780 06:12:46 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:16.780 06:12:46 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:16.780 06:12:46 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:16.780 06:12:46 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:16.780 06:12:46 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:16.780 06:12:46 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:16.780 06:12:46 -- setup/hugepages.sh@207 -- # get_nodes 00:03:16.780 06:12:46 -- setup/hugepages.sh@27 -- # local node 00:03:16.780 06:12:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:16.780 06:12:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:16.780 06:12:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:16.780 06:12:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:16.780 06:12:46 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:16.780 06:12:46 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:16.780 06:12:46 -- 
setup/hugepages.sh@208 -- # clear_hp 00:03:16.780 06:12:46 -- setup/hugepages.sh@37 -- # local node hp 00:03:16.780 06:12:46 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:16.780 06:12:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:16.781 06:12:46 -- setup/hugepages.sh@41 -- # echo 0 00:03:16.781 06:12:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:16.781 06:12:46 -- setup/hugepages.sh@41 -- # echo 0 00:03:16.781 06:12:46 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:16.781 06:12:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:16.781 06:12:46 -- setup/hugepages.sh@41 -- # echo 0 00:03:16.781 06:12:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:16.781 06:12:46 -- setup/hugepages.sh@41 -- # echo 0 00:03:16.781 06:12:46 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:16.781 06:12:46 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:16.781 06:12:46 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:16.781 06:12:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:16.781 06:12:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:16.781 06:12:46 -- common/autotest_common.sh@10 -- # set +x 00:03:16.781 ************************************ 00:03:16.781 START TEST default_setup 00:03:16.781 ************************************ 00:03:16.781 06:12:46 -- common/autotest_common.sh@1114 -- # default_setup 00:03:16.781 06:12:46 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:16.781 06:12:46 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:16.781 06:12:46 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:16.781 06:12:46 -- setup/hugepages.sh@51 -- # shift 00:03:16.781 06:12:46 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:16.781 06:12:46 -- setup/hugepages.sh@52 -- # local node_ids 00:03:16.781 06:12:46 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:16.781 06:12:46 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:16.781 06:12:46 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:16.781 06:12:46 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:16.781 06:12:46 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:16.781 06:12:46 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:16.781 06:12:46 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:16.781 06:12:46 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:16.781 06:12:46 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:16.781 06:12:46 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:16.781 06:12:46 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:16.781 06:12:46 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:16.781 06:12:46 -- setup/hugepages.sh@73 -- # return 0 00:03:16.781 06:12:46 -- setup/hugepages.sh@137 -- # setup output 00:03:16.781 06:12:46 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:16.781 06:12:46 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:20.158 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:00:04.3 (8086 2021): 
ioatdma -> vfio-pci 00:03:20.158 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:20.158 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:22.135 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:22.136 06:12:51 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:22.136 06:12:51 -- setup/hugepages.sh@89 -- # local node 00:03:22.136 06:12:51 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:22.136 06:12:51 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:22.136 06:12:51 -- setup/hugepages.sh@92 -- # local surp 00:03:22.136 06:12:51 -- setup/hugepages.sh@93 -- # local resv 00:03:22.136 06:12:51 -- setup/hugepages.sh@94 -- # local anon 00:03:22.136 06:12:51 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:22.136 06:12:51 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:22.136 06:12:51 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:22.136 06:12:51 -- setup/common.sh@18 -- # local node= 00:03:22.136 06:12:51 -- setup/common.sh@19 -- # local var val 00:03:22.136 06:12:51 -- setup/common.sh@20 -- # local mem_f mem 00:03:22.136 06:12:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:22.136 06:12:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:22.136 06:12:51 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:22.136 06:12:51 -- setup/common.sh@28 -- # mapfile -t mem 00:03:22.136 06:12:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:22.136 06:12:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.136 06:12:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.136 06:12:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43650600 kB' 'MemAvailable: 45231584 kB' 'Buffers: 4384 kB' 'Cached: 9640644 kB' 'SwapCached: 76 kB' 'Active: 6676144 kB' 'Inactive: 3555396 kB' 'Active(anon): 5763644 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590052 kB' 'Mapped: 178564 kB' 'Shmem: 7893496 kB' 'KReclaimable: 567212 kB' 'Slab: 1559784 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 992572 kB' 'KernelStack: 21824 kB' 'PageTables: 8740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10055992 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217908 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:22.136 06:12:51 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
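[annotation] The long `[[ Field == \H\u\g\e\p\a\g\e\s\i\z\e ]]` / continue runs here and above are get_meminfo scanning every /proc/meminfo key until the requested one matches (xtrace backslash-quotes the literal pattern, hence the noise), and the `${mem[@]#Node +([0-9]) }` strip lets the same loop read per-node meminfo files. A condensed sketch of that helper:

    shopt -s extglob                  # for the +([0-9]) pattern below
    get_meminfo() {                   # get_meminfo Hugepagesize [node]
        local get=$1 node=${2-} line var val _
        local mem_f=/proc/meminfo mem
        [[ -n $node ]] && mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")     # drop "Node N " prefixes
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }
    get_meminfo Hugepagesize          # -> 2048 (kB) on this box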
00:03:22.137 06:12:51 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:22.137 06:12:51 -- setup/common.sh@33 -- # echo 0
00:03:22.137 06:12:51 -- setup/common.sh@33 -- # return 0
00:03:22.137 06:12:51 -- setup/hugepages.sh@97 -- # anon=0
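The scan traced above is setup/common.sh's get_meminfo: it maps the meminfo file into an array, walks it key by key (the long [[ ... == ... ]] / continue runs, elided here since every field also appears verbatim in the printf snapshots below), and echoes the value of the first field matching $get. A minimal sketch of that pattern, reconstructed from the xtrace alone (the real setup/common.sh may differ in detail):

    #!/usr/bin/env bash
    shopt -s extglob                      # needed for the +([0-9]) pattern below
    get_meminfo() {                       # usage: get_meminfo <field> [node]
        local get=$1 node=${2:-}
        local var val _ line
        local mem_f=/proc/meminfo mem
        # With a node argument, read that node's own copy (common.sh@23-24).
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node <n> "; strip it (common.sh@29).
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            # Echo the numeric value once the requested field matches; "kB" lands in $_.
            [[ $var == "$get" ]] && echo "$val" && return 0
        done
        return 1
    }

Here get_meminfo AnonHugePages prints 0 (the snapshot below carries 'AnonHugePages: 0 kB'), which hugepages.sh@97 records as anon=0; the same helper is reused below for HugePages_Surp, HugePages_Rsvd and HugePages_Total.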
00:03:22.137 06:12:51 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:22.137 06:12:51 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:22.137 06:12:51 -- setup/common.sh@18 -- # local node=
00:03:22.137 06:12:51 -- setup/common.sh@19 -- # local var val
00:03:22.137 06:12:51 -- setup/common.sh@20 -- # local mem_f mem
00:03:22.137 06:12:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:22.137 06:12:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:22.137 06:12:51 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:22.137 06:12:51 -- setup/common.sh@28 -- # mapfile -t mem
00:03:22.137 06:12:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:22.137 06:12:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43652388 kB' 'MemAvailable: 45233372 kB' 'Buffers: 4384 kB' 'Cached: 9640644 kB' 'SwapCached: 76 kB' 'Active: 6675140 kB' 'Inactive: 3555396 kB' 'Active(anon): 5762640 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 588912 kB' 'Mapped: 178632 kB' 'Shmem: 7893496 kB' 'KReclaimable: 567212 kB' 'Slab: 1559008 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991796 kB' 'KernelStack: 21808 kB' 'PageTables: 8968 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10055768 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217956 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:22.139 06:12:51 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:22.139 06:12:51 -- setup/common.sh@33 -- # echo 0
00:03:22.139 06:12:51 -- setup/common.sh@33 -- # return 0
00:03:22.139 06:12:51 -- setup/hugepages.sh@99 -- # surp=0
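The four hugepage counters that hugepages.sh fetches one get_meminfo call at a time sit next to each other in /proc/meminfo, so a single pass shows the whole picture; on this node the snapshots above correspond to:

    grep -E '^(HugePages_(Total|Free|Rsvd|Surp)|Hugepagesize)' /proc/meminfo
    # HugePages_Total:    1024
    # HugePages_Free:     1024
    # HugePages_Rsvd:        0
    # HugePages_Surp:        0
    # Hugepagesize:       2048 kB

(grep shown only for illustration; the script re-reads the file on every get_meminfo call, so each check sees a fresh snapshot.)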
00:03:22.139 06:12:51 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:22.139 06:12:51 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:22.139 06:12:51 -- setup/common.sh@18 -- # local node=
00:03:22.139 06:12:51 -- setup/common.sh@19 -- # local var val
00:03:22.139 06:12:51 -- setup/common.sh@20 -- # local mem_f mem
00:03:22.139 06:12:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:22.139 06:12:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:22.139 06:12:51 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:22.139 06:12:51 -- setup/common.sh@28 -- # mapfile -t mem
00:03:22.139 06:12:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:22.139 06:12:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43652740 kB' 'MemAvailable: 45233724 kB' 'Buffers: 4384 kB' 'Cached: 9640660 kB' 'SwapCached: 76 kB' 'Active: 6675836 kB' 'Inactive: 3555396 kB' 'Active(anon): 5763336 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589644 kB' 'Mapped: 178684 kB' 'Shmem: 7893512 kB' 'KReclaimable: 567212 kB' 'Slab: 1559084 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991872 kB' 'KernelStack: 22080 kB' 'PageTables: 9056 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10056016 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217988 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:22.141 06:12:51 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:22.141 06:12:51 -- setup/common.sh@33 -- # echo 0
00:03:22.141 06:12:51 -- setup/common.sh@33 -- # return 0
00:03:22.141 06:12:51 -- setup/hugepages.sh@100 -- # resv=0
00:03:22.141 06:12:51 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:22.141 nr_hugepages=1024
00:03:22.141 06:12:51 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:22.141 resv_hugepages=0
00:03:22.141 06:12:51 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:22.141 surplus_hugepages=0
00:03:22.141 06:12:51 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:22.141 anon_hugepages=0
00:03:22.141 06:12:51 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:22.141 06:12:51 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
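The arithmetic at hugepages.sh@107-@110 is the pool accounting identity: the kernel's HugePages_Total must equal the requested nr_hugepages plus any surplus and reserved pages. Spelled out with the get_meminfo sketch from earlier (values taken from this run's snapshots; the composition is illustrative, not the literal hugepages.sh source):

    nr_hugepages=1024                          # requested earlier by the test setup
    anon=$(get_meminfo AnonHugePages)          # 0 - no anonymous (transparent) hugepages
    surp=$(get_meminfo HugePages_Surp)         # 0 - nothing allocated beyond nr_hugepages
    resv=$(get_meminfo HugePages_Rsvd)         # 0 - nothing reserved but not yet faulted in
    total=$(get_meminfo HugePages_Total)       # 1024, queried in the trace just below
    (( total == nr_hugepages + surp + resv ))  # 1024 == 1024 + 0 + 0: the pool is intact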
00:03:22.141 06:12:51 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:22.141 06:12:51 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:22.141 06:12:51 -- setup/common.sh@18 -- # local node=
00:03:22.141 06:12:51 -- setup/common.sh@19 -- # local var val
00:03:22.141 06:12:51 -- setup/common.sh@20 -- # local mem_f mem
00:03:22.141 06:12:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:22.141 06:12:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:22.141 06:12:51 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:22.141 06:12:51 -- setup/common.sh@28 -- # mapfile -t mem
00:03:22.141 06:12:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:22.141 06:12:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43653640 kB' 'MemAvailable: 45234624 kB' 'Buffers: 4384 kB' 'Cached: 9640676 kB' 'SwapCached: 76 kB' 'Active: 6676152 kB' 'Inactive: 3555396 kB' 'Active(anon): 5763652 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589912 kB' 'Mapped: 178684 kB' 'Shmem: 7893528 kB' 'KReclaimable: 567212 kB' 'Slab: 1559084 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991872 kB' 'KernelStack: 22016 kB' 'PageTables: 9172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10056032 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:22.143 06:12:51 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:22.143 06:12:51 -- setup/common.sh@33 -- # echo 1024
00:03:22.143 06:12:51 -- setup/common.sh@33 -- # return 0
00:03:22.143 06:12:51 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:22.143 06:12:51 -- setup/hugepages.sh@112 -- # get_nodes
00:03:22.143 06:12:51 -- setup/hugepages.sh@27 -- # local node
00:03:22.143 06:12:51 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:22.143 06:12:51 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:22.143 06:12:51 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:22.143 06:12:51 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:22.143 06:12:51 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:22.143 06:12:51 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:22.143 06:12:51 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:22.143 06:12:51 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:22.143 06:12:51 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:22.143 06:12:51 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:22.143 06:12:51 -- setup/common.sh@18 -- # local node=0
00:03:22.143 06:12:51 -- setup/common.sh@19 -- # local var val
00:03:22.143 06:12:51 -- setup/common.sh@20 -- # local mem_f mem
00:03:22.143 06:12:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:22.143 06:12:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:22.143 06:12:51 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:22.143 06:12:51 -- setup/common.sh@28 -- # mapfile -t mem
00:03:22.143 06:12:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:22.143 06:12:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23571776 kB' 'MemUsed: 9062660 kB' 'SwapCached: 44 kB' 'Active: 4297000 kB' 'Inactive: 532564 kB' 'Active(anon): 3519572 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4590556 kB' 'Mapped: 118968 kB' 'AnonPages: 242400 kB' 'Shmem: 3280576 kB' 'KernelStack: 10952 kB' 'PageTables: 5024 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 394740 kB' 'Slab: 873272 kB' 'SReclaimable: 394740 kB' 'SUnreclaim: 478532 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:22.143 06:12:51 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:22.143 06:12:51 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:22.143 06:12:51 -- setup/common.sh@18 -- # local node=0
00:03:22.143 06:12:51 -- setup/common.sh@19 -- # local var val
00:03:22.143 06:12:51 -- setup/common.sh@20 -- # local mem_f mem
00:03:22.143 06:12:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:22.143 06:12:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:22.143 06:12:51 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:22.143 06:12:51 -- setup/common.sh@28 -- # mapfile -t mem
00:03:22.143 06:12:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:22.143 06:12:51 -- setup/common.sh@31 -- # IFS=': '
00:03:22.143 06:12:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23571776 kB' 'MemUsed: 9062660 kB' 'SwapCached: 44 kB' 'Active: 4297000 kB' 'Inactive: 532564 kB' 'Active(anon): 3519572 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4590556 kB' 'Mapped: 118968 kB' 'AnonPages: 242400 kB' 'Shmem: 3280576 kB' 'KernelStack: 10952 kB' 'PageTables: 5024 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 394740 kB' 'Slab: 873272 kB' 'SReclaimable: 394740 kB' 'SUnreclaim: 478532 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:22.143 06:12:51 -- setup/common.sh@31 -- # read -r var val _
00:03:22.143 06:12:51 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:22.143 06:12:51 -- setup/common.sh@32 -- # continue
[xtrace condensed: the IFS=': ' / read -r var val _ / [[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue cycle repeats for every node-0 field from MemFree through HugePages_Free]
00:03:22.144 06:12:51 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:22.145 06:12:51 -- setup/common.sh@33 -- # echo 0
00:03:22.145 06:12:51 -- setup/common.sh@33 -- # return 0
00:03:22.145 06:12:51 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:22.145 06:12:51 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:22.145 06:12:51 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:22.145 06:12:51 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
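Restated outside the harness, the accounting the trace just completed amounts to the sketch below: the global pool must equal requested pages plus surplus plus reserved, and the per-node totals say where those pages landed. The values are this run's, and plain awk stands in for the script's own helpers.

#!/usr/bin/env bash
# Sketch of the pool audit (illustrative, not the SPDK code itself).
nr_hugepages=1024 surp=0 resv=0
total=$(awk '/^HugePages_Total:/ {print $NF}' /proc/meminfo)
(( total == nr_hugepages + surp + resv )) || echo "pool mismatch: got $total"

declare -a nodes_test
for node in /sys/devices/system/node/node[0-9]*; do
    n=${node##*node}
    # Per-node meminfo lines look like "Node 0 HugePages_Total:  1024",
    # so the last field is the value on both the global and per-node files.
    nodes_test[n]=$(awk '/HugePages_Total:/ {print $NF}' "$node/meminfo")
done
echo "node0=${nodes_test[0]} expecting 1024"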
00:03:22.145 06:12:51 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:22.145 node0=1024 expecting 1024
00:03:22.145 06:12:51 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:22.145 
00:03:22.145 real	0m5.257s
00:03:22.145 user	0m1.405s
00:03:22.145 sys	0m2.323s
00:03:22.145 06:12:51 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:22.145 06:12:51 -- common/autotest_common.sh@10 -- # set +x
00:03:22.145 ************************************
00:03:22.145 END TEST default_setup
00:03:22.145 ************************************
00:03:22.145 06:12:51 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:22.145 06:12:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:22.145 06:12:51 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:22.145 06:12:51 -- common/autotest_common.sh@10 -- # set +x
00:03:22.145 ************************************
00:03:22.145 START TEST per_node_1G_alloc
00:03:22.145 ************************************
00:03:22.145 06:12:51 -- common/autotest_common.sh@1114 -- # per_node_1G_alloc
00:03:22.145 06:12:51 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:22.145 06:12:51 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:22.145 06:12:51 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:22.145 06:12:51 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:22.145 06:12:51 -- setup/hugepages.sh@51 -- # shift
00:03:22.145 06:12:51 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:22.145 06:12:51 -- setup/hugepages.sh@52 -- # local node_ids
00:03:22.145 06:12:51 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:22.145 06:12:51 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:22.145 06:12:51 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:03:22.145 06:12:51 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:22.145 06:12:51 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:22.145 06:12:51 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:22.145 06:12:51 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:22.145 06:12:51 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:22.145 06:12:51 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:22.145 06:12:51 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:22.145 06:12:51 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:22.145 06:12:51 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:22.145 06:12:51 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:22.145 06:12:51 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:22.145 06:12:51 -- setup/hugepages.sh@73 -- # return 0
00:03:22.145 06:12:51 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:22.145 06:12:51 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:03:22.145 06:12:51 -- setup/hugepages.sh@146 -- # setup output
00:03:22.145 06:12:51 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:22.145 06:12:51 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
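The sizing get_test_nr_hugepages just traced is plain arithmetic: a 1 GiB request expressed in kB (1048576) is divided by the default 2048 kB hugepage size to give 512 pages, and that count is assigned to each node named in HUGENODE. A minimal sketch under the same inputs (variable names are illustrative); the setup.sh output it drives follows below.

#!/usr/bin/env bash
# Sizing sketch for the per-node 1G allocation (illustrative).
size_kb=1048576                # requested allocation: 1 GiB in kB
default_hugepage_kb=2048       # Hugepagesize reported by /proc/meminfo on this runner
nr_hugepages=$(( size_kb / default_hugepage_kb ))   # 512
user_nodes=(0 1)
declare -a nodes_test
for n in "${user_nodes[@]}"; do
    nodes_test[n]=$nr_hugepages                     # 512 pages pinned per node
done
# The trace sets "local IFS=," so the node list prints comma-separated:
(IFS=,; echo "NRHUGE=$nr_hugepages HUGENODE=${user_nodes[*]}")
# -> NRHUGE=512 HUGENODE=0,1, matching the setup invocation above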
00:03:25.443 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:25.443 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:25.443 06:12:54 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:03:25.443 06:12:54 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:03:25.443 06:12:54 -- setup/hugepages.sh@89 -- # local node
00:03:25.443 06:12:54 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:25.443 06:12:54 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:25.443 06:12:54 -- setup/hugepages.sh@92 -- # local surp
00:03:25.443 06:12:54 -- setup/hugepages.sh@93 -- # local resv
00:03:25.443 06:12:54 -- setup/hugepages.sh@94 -- # local anon
00:03:25.443 06:12:54 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:25.443 06:12:54 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:25.443 06:12:54 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:25.443 06:12:54 -- setup/common.sh@18 -- # local node=
00:03:25.443 06:12:54 -- setup/common.sh@19 -- # local var val
00:03:25.443 06:12:54 -- setup/common.sh@20 -- # local mem_f mem
00:03:25.443 06:12:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:25.443 06:12:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:25.443 06:12:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:25.443 06:12:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:25.443 06:12:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:25.443 06:12:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43626220 kB' 'MemAvailable: 45207204 kB' 'Buffers: 4384 kB' 'Cached: 9640760 kB' 'SwapCached: 76 kB' 'Active: 6677160 kB' 'Inactive: 3555396 kB' 'Active(anon): 5764660 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590648 kB' 'Mapped: 178728 kB' 'Shmem: 7893612 kB' 'KReclaimable: 567212 kB' 'Slab: 1559284 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 992072 kB' 'KernelStack: 21888 kB' 'PageTables: 8960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10052336 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218228 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:25.443 06:12:54 -- setup/common.sh@31 -- # IFS=': '
00:03:25.443 06:12:54 -- setup/common.sh@31 -- # read -r var val _
00:03:25.443 06:12:54 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:25.443 06:12:54 -- setup/common.sh@32 -- # continue
[xtrace condensed: the IFS=': ' / read -r var val _ / [[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] / continue cycle repeats for every field from MemFree through HardwareCorrupted]
00:03:25.444 06:12:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:25.444 06:12:54 -- setup/common.sh@33 -- # echo 0
00:03:25.444 06:12:54 -- setup/common.sh@33 -- # return 0
00:03:25.444 06:12:54 -- setup/hugepages.sh@97 -- # anon=0
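The [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] test above gates the anonymous-hugepage count on the transparent-hugepage mode string: the branch is taken here because THP is set to madvise rather than hard-disabled, so AnonHugePages (0 kB on this runner) is read back. A standalone sketch of that branch (illustrative, not the SPDK code):

#!/usr/bin/env bash
# Only count transparent hugepages toward the audit when THP is enabled.
thp_mode=$(< /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
anon=0
if [[ $thp_mode != *"[never]"* ]]; then
    anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)  # value in kB
fi
echo "anon_hugepages=$anon"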
00:03:25.444 06:12:54 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:25.444 06:12:54 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:25.444 06:12:54 -- setup/common.sh@18 -- # local node=
00:03:25.444 06:12:54 -- setup/common.sh@19 -- # local var val
00:03:25.444 06:12:54 -- setup/common.sh@20 -- # local mem_f mem
00:03:25.444 06:12:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:25.444 06:12:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:25.444 06:12:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:25.444 06:12:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:25.444 06:12:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:25.444 06:12:54 -- setup/common.sh@31 -- # IFS=': '
00:03:25.444 06:12:54 -- setup/common.sh@31 -- # read -r var val _
00:03:25.445 06:12:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43634312 kB' 'MemAvailable: 45215296 kB' 'Buffers: 4384 kB' 'Cached: 9640772 kB' 'SwapCached: 76 kB' 'Active: 6676420 kB' 'Inactive: 3555396 kB' 'Active(anon): 5763920 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589952 kB' 'Mapped: 178692 kB' 'Shmem: 7893624 kB' 'KReclaimable: 567212 kB' 'Slab: 1559496 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 992284 kB' 'KernelStack: 21872 kB' 'PageTables: 8712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10052348 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218116 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:25.445 06:12:54 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:25.445 06:12:54 -- setup/common.sh@32 -- # continue
[xtrace condensed: the IFS=': ' / read -r var val _ / [[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue cycle repeats for every field from MemFree through HugePages_Free]
00:03:25.445 06:12:54 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:25.445 06:12:54 -- setup/common.sh@33 -- # echo 0
00:03:25.445 06:12:54 -- setup/common.sh@33 -- # return 0
00:03:25.445 06:12:54 -- setup/hugepages.sh@99 -- # surp=0
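For reference, the three counters this verification reads back (total, reserved, surplus) are visible directly in /proc/meminfo outside the harness; the expected output shown in the comments below is what this run's meminfo dumps report:

grep -E 'HugePages_(Total|Free|Rsvd|Surp)' /proc/meminfo
# HugePages_Total:    1024
# HugePages_Free:     1024
# HugePages_Rsvd:        0
# HugePages_Surp:        0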
get=HugePages_Rsvd 00:03:25.446 06:12:54 -- setup/common.sh@18 -- # local node= 00:03:25.446 06:12:54 -- setup/common.sh@19 -- # local var val 00:03:25.446 06:12:54 -- setup/common.sh@20 -- # local mem_f mem 00:03:25.446 06:12:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:25.446 06:12:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:25.446 06:12:54 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:25.446 06:12:54 -- setup/common.sh@28 -- # mapfile -t mem 00:03:25.446 06:12:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43635124 kB' 'MemAvailable: 45216108 kB' 'Buffers: 4384 kB' 'Cached: 9640776 kB' 'SwapCached: 76 kB' 'Active: 6675472 kB' 'Inactive: 3555396 kB' 'Active(anon): 5762972 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 588964 kB' 'Mapped: 177532 kB' 'Shmem: 7893628 kB' 'KReclaimable: 567212 kB' 'Slab: 1559496 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 992284 kB' 'KernelStack: 21840 kB' 'PageTables: 8556 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10044780 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 
06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.446 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.446 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.446 
06:12:54 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ FileHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.447 06:12:54 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:25.447 06:12:54 -- setup/common.sh@33 -- # echo 0 00:03:25.447 06:12:54 -- setup/common.sh@33 -- # return 0 00:03:25.447 06:12:54 -- setup/hugepages.sh@100 -- # resv=0 00:03:25.447 06:12:54 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:25.447 nr_hugepages=1024 00:03:25.447 06:12:54 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:25.447 resv_hugepages=0 00:03:25.447 06:12:54 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:25.447 surplus_hugepages=0 00:03:25.447 06:12:54 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:25.447 anon_hugepages=0 00:03:25.447 06:12:54 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:25.447 06:12:54 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:25.447 06:12:54 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:25.447 06:12:54 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:25.447 06:12:54 -- setup/common.sh@18 -- # local node= 00:03:25.447 06:12:54 -- setup/common.sh@19 -- # local var val 00:03:25.447 06:12:54 -- setup/common.sh@20 -- # local mem_f mem 00:03:25.447 06:12:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:25.447 06:12:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:25.447 06:12:54 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:25.447 06:12:54 -- setup/common.sh@28 -- # mapfile -t mem 00:03:25.447 06:12:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:25.447 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43637080 kB' 'MemAvailable: 45218064 kB' 'Buffers: 4384 kB' 'Cached: 9640800 kB' 'SwapCached: 76 kB' 'Active: 6675112 kB' 'Inactive: 3555396 kB' 'Active(anon): 5762612 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 588540 kB' 'Mapped: 177532 kB' 'Shmem: 7893652 kB' 'KReclaimable: 567212 kB' 'Slab: 1559496 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 992284 kB' 'KernelStack: 21824 kB' 'PageTables: 8500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10044792 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 
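The xtrace above shows the shape of setup/common.sh's get_meminfo helper: it snapshots a meminfo file with mapfile, strips any "Node N " line prefix, then scans key/value pairs with an IFS=': ' read loop until the requested key matches, echoing the value and returning. A minimal standalone sketch of that pattern follows; the function body is reconstructed from the trace, not quoted from the SPDK source:

    #!/usr/bin/env bash
    shopt -s extglob  # needed for the +([0-9]) prefix-strip pattern below

    # Sketch of the get_meminfo pattern traced above (setup/common.sh).
    # Usage: get_meminfo KEY [NODE]. With a NODE argument it reads that
    # node's own meminfo file instead of /proc/meminfo.
    get_meminfo() {
        local get=$1 node=${2:-} var val _
        local mem_f=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local -a mem
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; drop it so the
        # key/value scan works the same for both sources.
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
            continue  # non-matching keys are skipped, as in the trace
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

In the run traced here it yields 0 for HugePages_Rsvd and, a little further down, 1024 for HugePages_Total.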
06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ 
AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 
00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.448 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.448 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ CmaTotal == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:25.449 06:12:54 -- setup/common.sh@33 -- # echo 1024 00:03:25.449 06:12:54 -- setup/common.sh@33 -- # return 0 00:03:25.449 06:12:54 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:25.449 06:12:54 -- setup/hugepages.sh@112 -- # get_nodes 00:03:25.449 06:12:54 -- setup/hugepages.sh@27 -- # local node 00:03:25.449 06:12:54 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:25.449 06:12:54 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:25.449 06:12:54 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:25.449 06:12:54 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:25.449 06:12:54 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:25.449 06:12:54 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:25.449 06:12:54 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:25.449 06:12:54 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:25.449 06:12:54 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:25.449 06:12:54 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:25.449 06:12:54 -- setup/common.sh@18 -- # local node=0 00:03:25.449 06:12:54 -- setup/common.sh@19 -- # local var val 00:03:25.449 06:12:54 -- setup/common.sh@20 -- # local mem_f mem 00:03:25.449 06:12:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:25.449 06:12:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:25.449 06:12:54 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:25.449 06:12:54 -- setup/common.sh@28 -- # mapfile -t mem 00:03:25.449 06:12:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24586996 kB' 'MemUsed: 8047440 kB' 'SwapCached: 44 kB' 'Active: 4296500 kB' 'Inactive: 532564 kB' 'Active(anon): 3519072 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4590632 kB' 'Mapped: 118164 kB' 'AnonPages: 241628 kB' 'Shmem: 3280652 kB' 'KernelStack: 10904 kB' 'PageTables: 4716 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 394740 kB' 'Slab: 873904 kB' 'SReclaimable: 394740 kB' 'SUnreclaim: 479164 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 
kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.449 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.449 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 
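The consistency checks interleaved in this trace (hugepages.sh@107-110) reduce to simple pool accounting: the kernel-wide HugePages_Total must equal the pages the test requested plus any surplus and reserved pages. With the values echoed earlier in this run (nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0):

    # Values taken from the queries traced above.
    nr_hugepages=1024   # requested by the test
    resv=0              # HugePages_Rsvd
    surp=0              # surplus_hugepages
    total=1024          # HugePages_Total
    (( total == nr_hugepages + surp + resv )) && echo "hugepage pool is consistent"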
00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@33 -- # echo 0 00:03:25.450 06:12:54 -- setup/common.sh@33 -- # return 0 00:03:25.450 06:12:54 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:25.450 06:12:54 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:25.450 06:12:54 -- setup/hugepages.sh@116 -- # (( 
nodes_test[node] += resv )) 00:03:25.450 06:12:54 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:25.450 06:12:54 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:25.450 06:12:54 -- setup/common.sh@18 -- # local node=1 00:03:25.450 06:12:54 -- setup/common.sh@19 -- # local var val 00:03:25.450 06:12:54 -- setup/common.sh@20 -- # local mem_f mem 00:03:25.450 06:12:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:25.450 06:12:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:25.450 06:12:54 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:25.450 06:12:54 -- setup/common.sh@28 -- # mapfile -t mem 00:03:25.450 06:12:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 19051332 kB' 'MemUsed: 8598028 kB' 'SwapCached: 32 kB' 'Active: 2378980 kB' 'Inactive: 3022832 kB' 'Active(anon): 2243908 kB' 'Inactive(anon): 2716308 kB' 'Active(file): 135072 kB' 'Inactive(file): 306524 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5054632 kB' 'Mapped: 59368 kB' 'AnonPages: 347340 kB' 'Shmem: 4613004 kB' 'KernelStack: 10936 kB' 'PageTables: 3840 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 172472 kB' 'Slab: 685592 kB' 'SReclaimable: 172472 kB' 'SUnreclaim: 513120 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.450 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.450 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.451 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.451 06:12:54 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.451 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.451 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.451 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.451 06:12:54 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.451 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.451 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.451 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.451 06:12:54 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.451 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 
00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 
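After the global check, hugepages.sh walks every NUMA node (get_nodes found two, with 512 pages expected on each), adding the reserved count and then the node's own HugePages_Surp, which the queries traced around this point read from /sys/devices/system/node/nodeN/meminfo. A condensed sketch of that walk, reusing the get_meminfo sketch above; the seed values are the ones visible in this run:

    # Per-node expectations seeded by get_nodes (512 each); resv and
    # both nodes' surplus counts are 0 in this run.
    nodes_test=(512 512)
    resv=0
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
        echo "node$node=${nodes_test[node]} expecting 512"
    done

which matches the "node0=512 expecting 512" / "node1=512 expecting 512" lines the test prints just below before passing.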
00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # continue 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.711 06:12:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.711 06:12:54 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.711 06:12:54 -- setup/common.sh@33 -- # echo 0 00:03:25.711 06:12:54 -- setup/common.sh@33 -- # return 0 00:03:25.711 06:12:54 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:25.711 06:12:54 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:25.711 06:12:54 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:25.711 06:12:54 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:25.711 06:12:54 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:25.711 node0=512 expecting 512 00:03:25.711 06:12:54 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:25.711 06:12:54 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:25.711 06:12:54 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:25.711 06:12:54 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:25.711 node1=512 expecting 512 00:03:25.711 06:12:54 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:25.711 00:03:25.711 real 0m3.446s 00:03:25.711 user 0m1.226s 00:03:25.711 sys 0m2.241s 00:03:25.711 06:12:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:25.711 06:12:54 -- common/autotest_common.sh@10 -- # set +x 00:03:25.711 ************************************ 00:03:25.711 END TEST per_node_1G_alloc 00:03:25.711 ************************************ 00:03:25.711 06:12:55 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:03:25.711 06:12:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:25.711 06:12:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:25.711 06:12:55 -- common/autotest_common.sh@10 -- # set +x 00:03:25.711 ************************************ 00:03:25.711 START TEST even_2G_alloc 00:03:25.711 ************************************ 00:03:25.711 06:12:55 -- common/autotest_common.sh@1114 -- # even_2G_alloc 00:03:25.711 06:12:55 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:03:25.711 06:12:55 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:25.711 06:12:55 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:25.711 06:12:55 -- setup/hugepages.sh@55 -- # (( size >= 
default_hugepages )) 00:03:25.711 06:12:55 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:25.711 06:12:55 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:25.711 06:12:55 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:25.711 06:12:55 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:25.711 06:12:55 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:25.711 06:12:55 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:25.711 06:12:55 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:25.711 06:12:55 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:25.711 06:12:55 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:25.711 06:12:55 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:25.711 06:12:55 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:25.711 06:12:55 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:25.711 06:12:55 -- setup/hugepages.sh@83 -- # : 512 00:03:25.711 06:12:55 -- setup/hugepages.sh@84 -- # : 1 00:03:25.711 06:12:55 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:25.711 06:12:55 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:25.711 06:12:55 -- setup/hugepages.sh@83 -- # : 0 00:03:25.711 06:12:55 -- setup/hugepages.sh@84 -- # : 0 00:03:25.711 06:12:55 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:25.711 06:12:55 -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:03:25.712 06:12:55 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:03:25.712 06:12:55 -- setup/hugepages.sh@153 -- # setup output 00:03:25.712 06:12:55 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:25.712 06:12:55 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:29.006 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:29.006 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:29.006 06:12:58 -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:03:29.006 06:12:58 -- setup/hugepages.sh@89 -- # local node 00:03:29.006 06:12:58 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:29.006 06:12:58 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:29.006 06:12:58 -- setup/hugepages.sh@92 -- # local surp 00:03:29.006 06:12:58 -- setup/hugepages.sh@93 -- # local resv 00:03:29.006 06:12:58 -- setup/hugepages.sh@94 -- # local anon 00:03:29.006 06:12:58 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:29.006 06:12:58 -- 
setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:29.006 06:12:58 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:29.006 06:12:58 -- setup/common.sh@18 -- # local node= 00:03:29.006 06:12:58 -- setup/common.sh@19 -- # local var val 00:03:29.006 06:12:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:29.006 06:12:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:29.006 06:12:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:29.006 06:12:58 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:29.006 06:12:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:29.006 06:12:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:29.006 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.006 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.006 06:12:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43684916 kB' 'MemAvailable: 45265900 kB' 'Buffers: 4384 kB' 'Cached: 9640892 kB' 'SwapCached: 76 kB' 'Active: 6677088 kB' 'Inactive: 3555396 kB' 'Active(anon): 5764588 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589936 kB' 'Mapped: 177620 kB' 'Shmem: 7893744 kB' 'KReclaimable: 567212 kB' 'Slab: 1558348 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991136 kB' 'KernelStack: 21856 kB' 'PageTables: 8628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10045400 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217972 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:29.006 06:12:58 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:29.006 06:12:58 -- setup/common.sh@32 -- # continue 00:03:29.006 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.006 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.006 06:12:58 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:29.006 06:12:58 -- setup/common.sh@32 -- # continue 00:03:29.006 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.006 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.006 06:12:58 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:29.006 06:12:58 -- setup/common.sh@32 -- # continue 00:03:29.006 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.006 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.006 06:12:58 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:29.006 06:12:58 -- setup/common.sh@32 -- # continue 00:03:29.273 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.273 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.273 06:12:58 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:29.273 06:12:58 -- setup/common.sh@32 -- # continue 00:03:29.273 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.273 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.273 
06:12:58 -- setup/common.sh@32 -- [xtrace condensed: get_meminfo compared every remaining /proc/meminfo key (SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted) against AnonHugePages and logged "continue" on each non-match] 00:03:29.273-00:03:29.274
06:12:58 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:29.274 06:12:58 -- setup/common.sh@33 -- # echo 0 00:03:29.274 06:12:58 -- setup/common.sh@33 -- # return 0 00:03:29.274 06:12:58 -- setup/hugepages.sh@97 -- # anon=0
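[editor's note] The xtrace above is setup/common.sh's get_meminfo helper walking a meminfo file one key at a time. A minimal bash sketch of the pattern, reconstructed only from the traced commands (the real setup/common.sh may differ in details the trace does not show):

#!/usr/bin/env bash
shopt -s extglob  # needed for the +([0-9]) pattern below

# get_meminfo KEY [NODE] -- echo the numeric value of KEY, read from
# /proc/meminfo or from a node's /sys/devices/system/node/nodeN/meminfo.
get_meminfo() {
        local get=$1    # the meminfo key to look up
        local node=$2   # optional NUMA node number
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # When a node is given, read that node's own meminfo instead.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
                mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node files prefix each line with "Node N "; strip it.
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
                [[ $var == "$get" ]] || continue
                echo "$val"   # e.g. "0" for AnonHugePages above
                return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
}

Usage matching the calls in this log: get_meminfo AnonHugePages prints 0 here, and get_meminfo HugePages_Surp 0 reads node 0's surplus count from the node's sysfs meminfo file.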
00:03:29.274 06:12:58 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:29.274 06:12:58 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:29.274 06:12:58 -- setup/common.sh@18 -- # local node= 00:03:29.274 06:12:58 -- setup/common.sh@19 -- # local var val 00:03:29.274 06:12:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:29.274 06:12:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:29.274 06:12:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:29.274 06:12:58 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:29.274 06:12:58 -- setup/common.sh@28 -- # mapfile -t mem
00:03:29.274 06:12:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:29.274 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.274 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.274 06:12:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43684664 kB' 'MemAvailable: 45265648 kB' 'Buffers: 4384 kB' 'Cached: 9640896 kB' 'SwapCached: 76 kB' 'Active: 6676832 kB' 'Inactive: 3555396 kB' 'Active(anon): 5764332 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589716 kB' 'Mapped: 177620 kB' 'Shmem: 7893748 kB' 'KReclaimable: 567212 kB' 'Slab: 1558348 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991136 kB' 'KernelStack: 21856 kB' 'PageTables: 8632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10045412 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217956 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
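[editor's note] The hugepage fields in the snapshot above are mutually consistent, which is what the test is about to verify. A quick arithmetic sketch, with values copied from the dump (the variable names here are ours, not the script's):

# Values copied from the meminfo snapshot above.
hugepages_total=1024   # HugePages_Total
hugepagesize_kb=2048   # Hugepagesize: 2048 kB
hugetlb_kb=2097152     # Hugetlb: 2097152 kB

# Hugetlb should equal HugePages_Total * Hugepagesize: 1024 * 2048 = 2097152.
(( hugepages_total * hugepagesize_kb == hugetlb_kb )) \
        && echo "hugetlb accounting consistent" \
        || echo "hugetlb accounting mismatch"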
06:12:58 -- setup/common.sh@32 -- [xtrace condensed: keys MemTotal through HugePages_Rsvd compared against HugePages_Surp; "continue" on every non-match] 00:03:29.274-00:03:29.275
06:12:58 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.275 06:12:58 -- setup/common.sh@33 -- # echo 0 00:03:29.275 06:12:58 -- setup/common.sh@33 -- # return 0 00:03:29.275 06:12:58 -- setup/hugepages.sh@99 -- # surp=0 00:03:29.275 06:12:58 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:29.276 06:12:58 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:29.276 06:12:58 -- setup/common.sh@18 -- # local node= 00:03:29.276 06:12:58 -- setup/common.sh@19 -- # local var val 00:03:29.276 06:12:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:29.276 06:12:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:29.276 06:12:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:29.276 06:12:58 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:29.276 06:12:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:29.276 06:12:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:29.276 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.276 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.276 06:12:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43684664 kB' 'MemAvailable: 45265648 kB' 'Buffers: 4384 kB' 'Cached: 9640896 kB' 'SwapCached: 76 kB' 'Active: 6676192 kB' 'Inactive: 3555396 kB' 'Active(anon): 5763692 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589524 kB' 'Mapped: 177536 kB' 'Shmem: 7893748 kB' 'KReclaimable: 567212 kB' 'Slab: 1558328 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991116 kB' 'KernelStack: 21856 kB' 'PageTables: 8612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10045428 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217956 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
06:12:58 -- setup/common.sh@32 -- [xtrace condensed: keys MemTotal through HugePages_Free compared against HugePages_Rsvd; "continue" on every non-match] 00:03:29.276-00:03:29.277
06:12:58 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.277 06:12:58 -- setup/common.sh@33 -- # echo 0 00:03:29.277 06:12:58 -- setup/common.sh@33 -- # return 0 00:03:29.277 06:12:58 -- setup/hugepages.sh@100 -- # resv=0 00:03:29.277 06:12:58 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:29.277 nr_hugepages=1024 06:12:58 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:29.277 resv_hugepages=0 06:12:58 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:29.277 surplus_hugepages=0 06:12:58 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:29.277 anon_hugepages=0 06:12:58 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:29.277 06:12:58 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:29.277
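[editor's note] The two arithmetic tests just traced are the core accounting check of this stage: the configured pool of 1024 hugepages must be exactly the requested count, with no surplus and no reserved pages outstanding. A sketch of the check using the values the trace just read (the if-wrapper is ours; the (( ... )) expressions are verbatim from the trace):

# Values read via get_meminfo in the trace above.
anon=0            # AnonHugePages
surp=0            # HugePages_Surp
resv=0            # HugePages_Rsvd
nr_hugepages=1024 # the pool size the test configured

# Pass only if the kernel reports exactly the requested pool.
if (( 1024 == nr_hugepages + surp + resv )) && (( 1024 == nr_hugepages )); then
        echo "hugepage pool matches the requested 1024 pages"
fi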
06:12:58 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:29.277 06:12:58 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:29.277 06:12:58 -- setup/common.sh@18 -- # local node= 00:03:29.277 06:12:58 -- setup/common.sh@19 -- # local var val 00:03:29.277 06:12:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:29.277 06:12:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:29.277 06:12:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:29.277 06:12:58 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:29.277 06:12:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:29.277 06:12:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:29.277 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.277 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.277 06:12:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43684160 kB' 'MemAvailable: 45265144 kB' 'Buffers: 4384 kB' 'Cached: 9640900 kB' 'SwapCached: 76 kB' 'Active: 6676336 kB' 'Inactive: 3555396 kB' 'Active(anon): 5763836 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589664 kB' 'Mapped: 177536 kB' 'Shmem: 7893752 kB' 'KReclaimable: 567212 kB' 'Slab: 1558328 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991116 kB' 'KernelStack: 21840 kB' 'PageTables: 8556 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10045444 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217956 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
06:12:58 -- setup/common.sh@32 -- [xtrace condensed: keys MemTotal through Unaccepted compared against HugePages_Total; "continue" on every non-match] 00:03:29.277-00:03:29.279
06:12:58 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:29.279 06:12:58 -- setup/common.sh@33 -- # echo 1024 00:03:29.279 06:12:58 -- setup/common.sh@33 -- # return 0 00:03:29.279 06:12:58 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:29.279
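[editor's note] The get_nodes trace that follows globs /sys/devices/system/node/node+([0-9]) to discover the NUMA nodes and records each node's hugepage count in nodes_sys (512 per node in this run, so 2 x 512 = 1024). A sketch, assuming the 512 comes from the per-node nr_hugepages sysfs counter; the xtrace only shows the already-evaluated assignment, so that source is our guess:

#!/usr/bin/env bash
shopt -s extglob
declare -a nodes_sys   # nodes_sys[N] = hugepages configured on node N

get_nodes() {
        local node
        for node in /sys/devices/system/node/node+([0-9]); do
                # ${node##*node} strips everything up to the last "node",
                # leaving the node number used as the array index.
                nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
        done
        no_nodes=${#nodes_sys[@]}
        (( no_nodes > 0 ))   # fail if no NUMA nodes were found
}

With the nodes known, the loop after get_nodes re-runs get_meminfo HugePages_Surp per node (node 0, then node 1), reading each node's own /sys/devices/system/node/nodeN/meminfo.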
00:03:29.279 06:12:58 -- setup/hugepages.sh@112 -- # get_nodes 00:03:29.279 06:12:58 -- setup/hugepages.sh@27 -- # local node 00:03:29.279 06:12:58 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:29.279 06:12:58 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:29.279 06:12:58 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:29.279 06:12:58 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:29.279 06:12:58 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:29.279 06:12:58 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:29.279 06:12:58 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:29.279 06:12:58 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:29.279 06:12:58 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:29.279 06:12:58 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:29.279 06:12:58 -- setup/common.sh@18 -- # local node=0 00:03:29.279 06:12:58 -- setup/common.sh@19 -- # local var val 00:03:29.279 06:12:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:29.279 06:12:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:29.279 06:12:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:29.279 06:12:58 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:29.279 06:12:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:29.279 06:12:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:29.279 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.279 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.279 06:12:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24602364 kB' 'MemUsed: 8032072 kB' 'SwapCached: 44 kB' 'Active: 4296516 kB' 'Inactive: 532564 kB' 'Active(anon): 3519088 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4590716 kB' 'Mapped: 118168 kB' 'AnonPages: 241592 kB' 'Shmem: 3280736 kB' 'KernelStack: 10888 kB' 'PageTables: 4724 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 394740 kB' 'Slab: 873020 kB' 'SReclaimable: 394740 kB' 'SUnreclaim: 478280 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
06:12:58 -- setup/common.sh@32 -- [xtrace condensed: node0 meminfo keys MemTotal through HugePages_Free compared against HugePages_Surp; "continue" on every non-match] 00:03:29.279-00:03:29.280
00:03:29.279 06:12:58 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.279 06:12:58 -- setup/common.sh@32 -- # continue 00:03:29.279 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.279 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.280 06:12:58 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.280 06:12:58 -- setup/common.sh@32 -- # continue 00:03:29.280 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.280 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.280 06:12:58 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.280 06:12:58 -- setup/common.sh@32 -- # continue 00:03:29.280 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.280 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.280 06:12:58 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.280 06:12:58 -- setup/common.sh@32 -- # continue 00:03:29.280 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.280 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.280 06:12:58 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.280 06:12:58 -- setup/common.sh@32 -- # continue 00:03:29.280 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.280 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.280 06:12:58 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.280 06:12:58 -- setup/common.sh@32 -- # continue 00:03:29.280 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.280 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.280 06:12:58 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.280 06:12:58 -- setup/common.sh@33 -- # echo 0 00:03:29.280 06:12:58 -- setup/common.sh@33 -- # return 0 00:03:29.280 06:12:58 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:29.280 06:12:58 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:29.280 06:12:58 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:29.280 06:12:58 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:29.280 06:12:58 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:29.280 06:12:58 -- setup/common.sh@18 -- # local node=1 00:03:29.280 06:12:58 -- setup/common.sh@19 -- # local var val 00:03:29.280 06:12:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:29.280 06:12:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:29.280 06:12:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:29.280 06:12:58 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:29.280 06:12:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:29.280 06:12:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:29.280 06:12:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:29.280 06:12:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:29.280 06:12:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 19082476 kB' 'MemUsed: 8566884 kB' 'SwapCached: 32 kB' 'Active: 2379716 kB' 'Inactive: 3022832 kB' 'Active(anon): 2244644 kB' 'Inactive(anon): 2716308 kB' 'Active(file): 135072 kB' 'Inactive(file): 306524 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5054680 kB' 'Mapped: 59368 kB' 'AnonPages: 347908 kB' 'Shmem: 4613052 kB' 'KernelStack: 10952 kB' 'PageTables: 3832 kB' 
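For readability, here is a condensed sketch of what the traced get_meminfo helper is doing, reconstructed from the xtrace above; it is not the verbatim SPDK setup/common.sh, which may spell the loop differently. It picks the per-node meminfo file when a node id is given, strips the 'Node N ' prefix those files carry, then scans field by field until the requested key matches:

    #!/usr/bin/env bash
    # Sketch reconstructed from the xtrace -- not the verbatim SPDK helper.
    shopt -s extglob   # needed for the +([0-9]) pattern used below

    get_meminfo() {
        local get=$1 node=$2
        local var val _
        local mem_f=/proc/meminfo mem
        # Use the per-node view when a node id is given and the file exists.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node files prefix each field with 'Node N '
        local line
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue   # the long 'continue' runs in the trace
            echo "${val:-0}"                   # e.g. get_meminfo HugePages_Surp 0 -> 0
            return 0
        done
    }

This mirrors the trace exactly: for HugePages_Surp on node0, every earlier meminfo field is skipped with 'continue' and the call prints 0.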
00:03:29.280 06:12:58 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:29.280 06:12:58 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:29.280 06:12:58 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:29.280 06:12:58 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:29.280 06:12:58 -- setup/common.sh@18 -- # local node=1
00:03:29.280 06:12:58 -- setup/common.sh@19 -- # local var val
00:03:29.280 06:12:58 -- setup/common.sh@20 -- # local mem_f mem
00:03:29.280 06:12:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:29.280 06:12:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:29.280 06:12:58 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:29.280 06:12:58 -- setup/common.sh@28 -- # mapfile -t mem
00:03:29.280 06:12:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:29.280 06:12:58 -- setup/common.sh@31 -- # IFS=': '
00:03:29.280 06:12:58 -- setup/common.sh@31 -- # read -r var val _
00:03:29.280 06:12:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 19082476 kB' 'MemUsed: 8566884 kB' 'SwapCached: 32 kB' 'Active: 2379716 kB' 'Inactive: 3022832 kB' 'Active(anon): 2244644 kB' 'Inactive(anon): 2716308 kB' 'Active(file): 135072 kB' 'Inactive(file): 306524 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5054680 kB' 'Mapped: 59368 kB' 'AnonPages: 347908 kB' 'Shmem: 4613052 kB' 'KernelStack: 10952 kB' 'PageTables: 3832 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 172472 kB' 'Slab: 685312 kB' 'SReclaimable: 172472 kB' 'SUnreclaim: 512840 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:29.280 06:12:58 -- setup/common.sh@32 -- # [xtrace condensed: each field from MemTotal through HugePages_Free is tested against HugePages_Surp and skipped with 'continue']
00:03:29.281 06:12:58 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:29.281 06:12:58 -- setup/common.sh@33 -- # echo 0
00:03:29.281 06:12:58 -- setup/common.sh@33 -- # return 0
00:03:29.281 06:12:58 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:29.281 06:12:58 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:29.281 06:12:58 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:29.281 06:12:58 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:29.281 06:12:58 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:29.281 node0=512 expecting 512
00:03:29.281 06:12:58 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:29.281 06:12:58 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:29.281 06:12:58 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:29.281 06:12:58 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:03:29.281 node1=512 expecting 512
00:03:29.281 06:12:58 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:29.281 
00:03:29.281 real 0m3.665s
00:03:29.281 user 0m1.351s
00:03:29.281 sys 0m2.382s
00:03:29.281 06:12:58 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:29.281 06:12:58 -- common/autotest_common.sh@10 -- # set +x
00:03:29.281 ************************************
00:03:29.281 END TEST even_2G_alloc
00:03:29.281 ************************************
00:03:29.281 06:12:58 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:03:29.281 06:12:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:29.281 06:12:58 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:29.281 06:12:58 -- common/autotest_common.sh@10 -- # set +x
00:03:29.281 ************************************
00:03:29.281 START TEST odd_alloc
00:03:29.281 ************************************
00:03:29.281 06:12:58 -- common/autotest_common.sh@1114 -- # odd_alloc
00:03:29.281 06:12:58 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:03:29.281 06:12:58 -- setup/hugepages.sh@49 -- # local size=2098176
00:03:29.281 06:12:58 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:29.281 06:12:58 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:29.281 06:12:58 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:03:29.281 06:12:58 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:29.281 06:12:58 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:29.281 06:12:58 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:29.281 06:12:58 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:03:29.281 06:12:58 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:29.281 06:12:58 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:29.281 06:12:58 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:29.281 06:12:58 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:29.281 06:12:58 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:29.281 06:12:58 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:29.281 06:12:58 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:29.281 06:12:58 -- setup/hugepages.sh@83 -- # : 513
00:03:29.281 06:12:58 -- setup/hugepages.sh@84 -- # : 1
00:03:29.281 06:12:58 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:29.281 06:12:58 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:03:29.281 06:12:58 -- setup/hugepages.sh@83 -- # : 0
00:03:29.281 06:12:58 -- setup/hugepages.sh@84 -- # : 0
00:03:29.281 06:12:58 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:29.281 06:12:58 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
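The arithmetic behind the odd_alloc setup traced above: HUGEMEM=2049 (MB) is 2049 * 1024 = 2098176 kB, and with the default 2048 kB hugepage size that is 2098176 / 2048 = 1024.5 pages, which the helper evidently rounds up to nr_hugepages=1025. An odd total cannot split evenly across two NUMA nodes, which is why the trace books 512 pages on node1 and 513 on node0. A minimal sketch of that split, reconstructed from the xtrace (the real setup/hugepages.sh may spell it differently; the ': 513' / ': 1' no-ops in the trace look like the remaining-pages and remainder breadcrumbs):

    # Sketch only: distribute _nr_hugepages across _no_nodes, highest node first,
    # so any remainder lands on the lower-numbered nodes (1025 over 2 -> 513 + 512).
    get_test_nr_hugepages_per_node() {
        local _nr_hugepages=$1 _no_nodes=$2
        local -g nodes_test=()
        while (( _no_nodes > 0 )); do
            nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
            _nr_hugepages=$(( _nr_hugepages - nodes_test[_no_nodes - 1] ))
            (( _no_nodes-- ))
        done
    }
    get_test_nr_hugepages_per_node 1025 2   # -> nodes_test=(513 512), matching the trace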
00:03:29.281 06:12:58 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:03:29.281 06:12:58 -- setup/hugepages.sh@160 -- # setup output
00:03:29.281 06:12:58 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:29.281 06:12:58 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:32.573 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:32.573 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:32.839 06:13:02 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:03:32.839 06:13:02 -- setup/hugepages.sh@89 -- # local node
00:03:32.839 06:13:02 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:32.839 06:13:02 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:32.839 06:13:02 -- setup/hugepages.sh@92 -- # local surp
00:03:32.839 06:13:02 -- setup/hugepages.sh@93 -- # local resv
00:03:32.839 06:13:02 -- setup/hugepages.sh@94 -- # local anon
00:03:32.839 06:13:02 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:32.839 06:13:02 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:32.839 06:13:02 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:32.839 06:13:02 -- setup/common.sh@18 -- # local node=
00:03:32.839 06:13:02 -- setup/common.sh@19 -- # local var val
00:03:32.839 06:13:02 -- setup/common.sh@20 -- # local mem_f mem
00:03:32.839 06:13:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:32.839 06:13:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:32.839 06:13:02 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:32.839 06:13:02 -- setup/common.sh@28 -- # mapfile -t mem
00:03:32.839 06:13:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:32.839 06:13:02 -- setup/common.sh@31 -- # IFS=': '
00:03:32.839 06:13:02 -- setup/common.sh@31 -- # read -r var val _
00:03:32.839 06:13:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43702592 kB' 'MemAvailable: 45283576 kB' 'Buffers: 4384 kB' 'Cached: 9641020 kB' 'SwapCached: 76 kB' 'Active: 6675816 kB' 'Inactive: 3555396 kB' 'Active(anon): 5763316 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589608 kB' 'Mapped: 177652 kB' 'Shmem: 7893872 kB' 'KReclaimable: 567212 kB' 'Slab: 1557480 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 990268 kB' 'KernelStack: 21952 kB' 'PageTables: 9056 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10049088 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218244 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:32.839 06:13:02 -- setup/common.sh@32 -- # [xtrace condensed: each field from MemTotal through HardwareCorrupted is tested against AnonHugePages and skipped with 'continue']
00:03:32.840 06:13:02 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:32.840 06:13:02 -- setup/common.sh@33 -- # echo 0
00:03:32.840 06:13:02 -- setup/common.sh@33 -- # return 0
00:03:32.840 06:13:02 -- setup/hugepages.sh@97 -- # anon=0
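Before comparing per-node counts, verify_nr_hugepages samples three global counters; the xtrace above shows the first of them (AnonHugePages) coming back 0. A condensed sketch of this phase, reconstructed from the trace; the sysfs path is inferred from the 'always [madvise] never' string in the @96 test and is an assumption, and the sketch assumes the get_meminfo sketch earlier is in scope:

    # Sketch of the global counters verify_nr_hugepages samples (per the xtrace).
    # Path below is inferred, not confirmed by the log.
    if [[ $(cat /sys/kernel/mm/transparent_hugepage/enabled) != *'[never]'* ]]; then
        anon=$(get_meminfo AnonHugePages)   # 0 kB in this run: THP is not inflating counts
    fi
    surp=$(get_meminfo HugePages_Surp)      # surplus pages beyond nr_hugepages (0 here)
    resv=$(get_meminfo HugePages_Rsvd)      # reserved-but-unfaulted pages (0 here)
    # Per the earlier even_2G_alloc trace, the expected per-node counts then get
    # resv plus each node's own HugePages_Surp added before the final comparison.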
00:03:32.840 06:13:02 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:32.840 06:13:02 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:32.840 06:13:02 -- setup/common.sh@18 -- # local node=
00:03:32.840 06:13:02 -- setup/common.sh@19 -- # local var val
00:03:32.840 06:13:02 -- setup/common.sh@20 -- # local mem_f mem
00:03:32.840 06:13:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:32.840 06:13:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:32.840 06:13:02 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:32.840 06:13:02 -- setup/common.sh@28 -- # mapfile -t mem
00:03:32.840 06:13:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:32.840 06:13:02 -- setup/common.sh@31 -- # IFS=': '
00:03:32.840 06:13:02 -- setup/common.sh@31 -- # read -r var val _
00:03:32.840 06:13:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43701080 kB' 'MemAvailable: 45282064 kB' 'Buffers: 4384 kB' 'Cached: 9641024 kB' 'SwapCached: 76 kB' 'Active: 6676504 kB' 'Inactive: 3555396 kB' 'Active(anon): 5764004 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589696 kB' 'Mapped: 177536 kB' 'Shmem: 7893876 kB' 'KReclaimable: 567212 kB' 'Slab: 1557448 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 990236 kB' 'KernelStack: 21952 kB' 'PageTables: 8824 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10050616 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218196 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:32.841 06:13:02 -- setup/common.sh@32 -- # [xtrace condensed: each field from MemTotal through HugePages_Rsvd is tested against HugePages_Surp and skipped with 'continue']
00:03:32.841 06:13:02 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:32.841 06:13:02 -- setup/common.sh@33 -- # echo 0
00:03:32.841 06:13:02 -- setup/common.sh@33 -- # return 0
00:03:32.841 06:13:02 -- setup/hugepages.sh@99 -- # surp=0
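Worth noting: all of the global /proc/meminfo snapshots in this run agree on the hugepage counters, and they are self-consistent with the requested allocation. A quick check of the arithmetic (a hypothetical one-liner, not part of the test itself):

    # 1025 pages of 2048 kB each matches the Hugetlb figure in the snapshots
    echo $(( 1025 * 2048 )) kB   # -> 2099200 kB, i.e. 'Hugetlb: 2099200 kB'

With HugePages_Surp and HugePages_Rsvd both 0, the 513 + 512 = 1025 split booked by get_test_nr_hugepages_per_node is exactly what verify_nr_hugepages should find.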
00:03:32.841 06:13:02 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:32.841 06:13:02 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:32.841 06:13:02 -- setup/common.sh@18 -- # local node=
00:03:32.841 06:13:02 -- setup/common.sh@19 -- # local var val
00:03:32.841 06:13:02 -- setup/common.sh@20 -- # local mem_f mem
00:03:32.841 06:13:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:32.841 06:13:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:32.841 06:13:02 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:32.841 06:13:02 -- setup/common.sh@28 -- # mapfile -t mem
00:03:32.841 06:13:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:32.841 06:13:02 -- setup/common.sh@31 -- # IFS=': '
00:03:32.841 06:13:02 -- setup/common.sh@31 -- # read -r var val _
00:03:32.841 06:13:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43701172 kB' 'MemAvailable: 45282156 kB' 'Buffers: 4384 kB' 'Cached: 9641036 kB' 'SwapCached: 76 kB' 'Active: 6676452 kB' 'Inactive: 3555396 kB' 'Active(anon): 5763952 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589640 kB' 'Mapped: 177536 kB' 'Shmem: 7893888 kB' 'KReclaimable: 567212 kB' 'Slab: 1557448 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 990236 kB' 'KernelStack: 22048 kB' 'PageTables: 9044 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10050628 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218276 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:32.842 06:13:02 -- setup/common.sh@32 -- # [xtrace condensed: fields MemTotal through AnonHugePages each tested against HugePages_Rsvd and skipped with 'continue']
00:03:32.842 06:13:02 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.842 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.842 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.842 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:32.843 06:13:02 -- setup/common.sh@33 -- # echo 0 00:03:32.843 06:13:02 -- setup/common.sh@33 -- # return 0 00:03:32.843 06:13:02 -- setup/hugepages.sh@100 -- # resv=0 00:03:32.843 06:13:02 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:03:32.843 nr_hugepages=1025 00:03:32.843 06:13:02 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:32.843 resv_hugepages=0 00:03:32.843 06:13:02 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:32.843 surplus_hugepages=0 00:03:32.843 06:13:02 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:32.843 anon_hugepages=0 00:03:32.843 06:13:02 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:32.843 06:13:02 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:03:32.843 06:13:02 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:32.843 06:13:02 -- setup/common.sh@17 -- # 
local get=HugePages_Total 00:03:32.843 06:13:02 -- setup/common.sh@18 -- # local node= 00:03:32.843 06:13:02 -- setup/common.sh@19 -- # local var val 00:03:32.843 06:13:02 -- setup/common.sh@20 -- # local mem_f mem 00:03:32.843 06:13:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:32.843 06:13:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:32.843 06:13:02 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:32.843 06:13:02 -- setup/common.sh@28 -- # mapfile -t mem 00:03:32.843 06:13:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43701264 kB' 'MemAvailable: 45282248 kB' 'Buffers: 4384 kB' 'Cached: 9641052 kB' 'SwapCached: 76 kB' 'Active: 6676612 kB' 'Inactive: 3555396 kB' 'Active(anon): 5764112 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589800 kB' 'Mapped: 177536 kB' 'Shmem: 7893904 kB' 'KReclaimable: 567212 kB' 'Slab: 1557448 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 990236 kB' 'KernelStack: 21792 kB' 'PageTables: 8488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10049128 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218148 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l 
]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.843 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.843 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 
-- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- 
# continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 
00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # continue 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.844 06:13:02 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:32.844 06:13:02 -- setup/common.sh@33 -- # echo 1025 00:03:32.844 06:13:02 -- setup/common.sh@33 -- # return 0 00:03:32.844 06:13:02 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:32.844 06:13:02 -- setup/hugepages.sh@112 -- # get_nodes 00:03:32.844 06:13:02 -- setup/hugepages.sh@27 -- # local node 00:03:32.844 06:13:02 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:32.844 06:13:02 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:32.844 06:13:02 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:32.844 06:13:02 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:03:32.844 06:13:02 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:32.844 06:13:02 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:32.844 06:13:02 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:32.844 06:13:02 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:32.844 06:13:02 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:32.844 06:13:02 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:32.844 06:13:02 -- setup/common.sh@18 -- # local node=0 00:03:32.844 06:13:02 -- setup/common.sh@19 -- # local var val 00:03:32.844 06:13:02 -- setup/common.sh@20 -- # local mem_f mem 00:03:32.844 06:13:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:32.844 06:13:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:32.844 06:13:02 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:32.844 06:13:02 -- setup/common.sh@28 -- # mapfile -t mem 00:03:32.844 06:13:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': ' 00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _ 00:03:32.845 06:13:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24603968 kB' 'MemUsed: 8030468 kB' 'SwapCached: 44 
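The repetitive trace above is a single helper at work: setup/common.sh's get_meminfo walks one meminfo file a line at a time and echoes the value of the first field whose name matches; with a NUMA node id it reads /sys/devices/system/node/nodeN/meminfo instead of /proc/meminfo. A minimal stand-alone sketch, reconstructed from the traced statements (the real setup/common.sh may differ in detail):

    #!/usr/bin/env bash
    shopt -s extglob  # needed for the +([0-9]) pattern below

    # get_meminfo FIELD [NODE] -- echo the value of FIELD from /proc/meminfo,
    # or from the per-node meminfo file when a NUMA node id is given.
    get_meminfo() {
        local get=$1 node=$2 var val _
        local mem_f=/proc/meminfo mem
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # per-node files prefix every line with "Node <id> "; strip it
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            # keys like HugePages_Total carry a bare count; the others end in
            # "NNN kB", and IFS splitting drops the unit into the third field
            [[ $var == "$get" ]] && echo "$val" && return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

    get_meminfo HugePages_Total     # system-wide, e.g. 1025 in this run
    get_meminfo HugePages_Surp 0    # node0 only

The per-field [[ ... == \H\u\g\e... ]] / continue lines in the log are simply the xtrace of that matching loop, one iteration per meminfo field until the requested key is found.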
00:03:32.844 06:13:02 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:32.844 06:13:02 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:32.844 06:13:02 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:32.844 06:13:02 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:32.844 06:13:02 -- setup/common.sh@18 -- # local node=0
00:03:32.844 06:13:02 -- setup/common.sh@19 -- # local var val
00:03:32.844 06:13:02 -- setup/common.sh@20 -- # local mem_f mem
00:03:32.844 06:13:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:32.844 06:13:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:32.844 06:13:02 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:32.844 06:13:02 -- setup/common.sh@28 -- # mapfile -t mem
00:03:32.844 06:13:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:32.844 06:13:02 -- setup/common.sh@31 -- # IFS=': '
00:03:32.844 06:13:02 -- setup/common.sh@31 -- # read -r var val _
00:03:32.845 06:13:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24603968 kB' 'MemUsed: 8030468 kB' 'SwapCached: 44 kB' 'Active: 4297256 kB' 'Inactive: 532564 kB' 'Active(anon): 3519828 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4590804 kB' 'Mapped: 118168 kB' 'AnonPages: 242296 kB' 'Shmem: 3280824 kB' 'KernelStack: 10984 kB' 'PageTables: 4956 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 394740 kB' 'Slab: 872464 kB' 'SReclaimable: 394740 kB' 'SUnreclaim: 477724 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:32.845 06:13:02 -- setup/common.sh@32 -- # [[ ... == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] (node0 fields MemTotal through HugePages_Free each fail the match and hit @32 continue; per-field xtrace condensed)
00:03:32.845 06:13:02 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:32.845 06:13:02 -- setup/common.sh@33 -- # echo 0
00:03:32.845 06:13:02 -- setup/common.sh@33 -- # return 0
00:03:32.845 06:13:02 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:32.845 06:13:02 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:32.845 06:13:02 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:32.845 06:13:02 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:32.845 06:13:02 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:32.845 06:13:02 -- setup/common.sh@18 -- # local node=1
00:03:32.845 06:13:02 -- setup/common.sh@19 -- # local var val
00:03:32.845 06:13:02 -- setup/common.sh@20 -- # local mem_f mem
00:03:32.845 06:13:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:32.846 06:13:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:32.846 06:13:02 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:32.846 06:13:02 -- setup/common.sh@28 -- # mapfile -t mem
00:03:32.846 06:13:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:32.846 06:13:02 -- setup/common.sh@31 -- # IFS=': '
00:03:32.846 06:13:02 -- setup/common.sh@31 -- # read -r var val _
00:03:32.846 06:13:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 19096316 kB' 'MemUsed: 8553044 kB' 'SwapCached: 32 kB' 'Active: 2379624 kB' 'Inactive: 3022832 kB' 'Active(anon): 2244552 kB' 'Inactive(anon): 2716308 kB' 'Active(file): 135072 kB' 'Inactive(file): 306524 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5054720 kB' 'Mapped: 59368 kB' 'AnonPages: 347760 kB' 'Shmem: 4613092 kB' 'KernelStack: 11000 kB' 'PageTables: 3660 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 172472 kB' 'Slab: 684984 kB' 'SReclaimable: 172472 kB' 'SUnreclaim: 512512 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
00:03:32.846 06:13:02 -- setup/common.sh@32 -- # [[ ... == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] (node1 fields MemTotal through HugePages_Free each fail the match and hit @32 continue; per-field xtrace condensed)
00:03:33.106 06:13:02 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:33.106 06:13:02 -- setup/common.sh@33 -- # echo 0
00:03:33.106 06:13:02 -- setup/common.sh@33 -- # return 0
00:03:33.106 06:13:02 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
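What follows is the order-insensitive comparison that closes odd_alloc: 1025 pages cannot split evenly over two nodes, so one node holds 512 and the other 513, and the test only requires the multiset of per-node totals to match. The sorted_t/sorted_s lines use plain indexed arrays with the page counts as indices, so "${!arr[@]}" hands the counts back in ascending order. A small self-contained sketch of that trick, with the array values taken from this run:

    # counts observed per node vs. counts the test expects per node
    nodes_test=([0]=512 [1]=513)
    nodes_sys=([0]=513 [1]=512)
    sorted_t=() sorted_s=()
    for node in "${!nodes_test[@]}"; do
        # using the count as the array index sorts the key list numerically
        sorted_t[nodes_test[node]]=1
        sorted_s[nodes_sys[node]]=1
    done
    # both expand to "512 513", so the split passes regardless of node order
    [[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo "per-node split OK"

That final test is exactly the traced [[ 512 513 == \5\1\2\ \5\1\3 ]] below.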
00:03:33.106 06:13:02 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:33.106 06:13:02 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:33.106 06:13:02 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:33.106 06:13:02 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:03:33.106 node0=512 expecting 513
00:03:33.106 06:13:02 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:33.106 06:13:02 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:33.106 06:13:02 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:33.106 06:13:02 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:03:33.106 node1=513 expecting 512
00:03:33.106 06:13:02 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:03:33.106
00:03:33.106 real 0m3.631s
00:03:33.106 user 0m1.304s
00:03:33.106 sys 0m2.396s
00:03:33.106 06:13:02 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:33.106 06:13:02 -- common/autotest_common.sh@10 -- # set +x
00:03:33.106 ************************************
00:03:33.106 END TEST odd_alloc
00:03:33.106 ************************************
00:03:33.106 06:13:02 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:03:33.106 06:13:02 -- common/autotest_common.sh@1087 --
# '[' 2 -le 1 ']'
00:03:33.106 06:13:02 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:33.106 06:13:02 -- common/autotest_common.sh@10 -- # set +x
00:03:33.106 ************************************
00:03:33.106 START TEST custom_alloc
00:03:33.106 ************************************
00:03:33.106 06:13:02 -- common/autotest_common.sh@1114 -- # custom_alloc
00:03:33.106 06:13:02 -- setup/hugepages.sh@167 -- # local IFS=,
00:03:33.106 06:13:02 -- setup/hugepages.sh@169 -- # local node
00:03:33.106 06:13:02 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:03:33.106 06:13:02 -- setup/hugepages.sh@170 -- # local nodes_hp
00:03:33.106 06:13:02 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:03:33.106 06:13:02 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:03:33.106 06:13:02 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:33.106 06:13:02 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:33.106 06:13:02 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:33.106 06:13:02 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:33.106 06:13:02 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:33.106 06:13:02 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:33.106 06:13:02 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:33.106 06:13:02 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:33.106 06:13:02 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:33.106 06:13:02 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:33.106 06:13:02 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:33.106 06:13:02 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:33.106 06:13:02 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:33.106 06:13:02 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:33.106 06:13:02 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:33.106 06:13:02 -- setup/hugepages.sh@83 -- # : 256
00:03:33.106 06:13:02 -- setup/hugepages.sh@84 -- # : 1
00:03:33.106 06:13:02 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:33.106 06:13:02 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:33.106 06:13:02 -- setup/hugepages.sh@83 -- # : 0
00:03:33.106 06:13:02 -- setup/hugepages.sh@84 -- # : 0
00:03:33.107 06:13:02 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:33.107 06:13:02 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:03:33.107 06:13:02 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:03:33.107 06:13:02 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:03:33.107 06:13:02 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:33.107 06:13:02 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:33.107 06:13:02 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:33.107 06:13:02 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:33.107 06:13:02 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:33.107 06:13:02 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:33.107 06:13:02 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:33.107 06:13:02 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:33.107 06:13:02 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:33.107 06:13:02 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:33.107 06:13:02 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:33.107 06:13:02 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:33.107 06:13:02 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:03:33.107 06:13:02 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:33.107 06:13:02 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:33.107 06:13:02 -- setup/hugepages.sh@78 -- # return 0
00:03:33.107 06:13:02 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:03:33.107 06:13:02 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:33.107 06:13:02 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:33.107 06:13:02 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:33.107 06:13:02 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:33.107 06:13:02 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:33.107 06:13:02 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:33.107 06:13:02 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:03:33.107 06:13:02 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:33.107 06:13:02 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:33.107 06:13:02 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:33.107 06:13:02 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:33.107 06:13:02 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:33.107 06:13:02 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:33.107 06:13:02 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:33.107 06:13:02 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:03:33.107 06:13:02 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:33.107 06:13:02 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:33.107 06:13:02 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:33.107 06:13:02 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:03:33.107 06:13:02 -- setup/hugepages.sh@78 -- # return 0
00:03:33.107 06:13:02 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:03:33.107 06:13:02 -- setup/hugepages.sh@187 -- # setup output
00:03:33.107 06:13:02 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:33.107 06:13:02 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:36.403 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:36.403 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:36.403 06:13:05 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
00:03:36.403 06:13:05 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
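Before verify_nr_hugepages runs, custom_alloc has translated two size requests into per-node targets and a HUGENODE string for setup.sh: 1048576 kB at the 2048 kB default page size is 512 pages, 2097152 kB is 1024, and the trace ends with HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'. A hedged sketch of that arithmetic and string assembly (default_hugepages is hard-coded here from the Hugepagesize line in the snapshots above; the real get_test_nr_hugepages reads it at run time):

    default_hugepages=2048            # kB, per 'Hugepagesize: 2048 kB' above
    for size in 1048576 2097152; do   # kB requests from the trace
        echo "size=$size -> nr_hugepages=$((size / default_hugepages))"
    done                              # -> 512 and 1024

    nodes_hp=([0]=512 [1]=1024)       # per-node targets from the trace
    HUGENODE=() _nr_hugepages=0
    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
        ((_nr_hugepages += nodes_hp[node]))
    done
    IFS=,                             # local IFS=, in the trace: join on commas
    echo "HUGENODE=${HUGENODE[*]} (total $_nr_hugepages)"
    # HUGENODE=nodes_hp[0]=512,nodes_hp[1]=1024 (total 1536)

The 512 + 1024 total is the nr_hugepages=1536 that setup.sh is asked to provision and that verify_nr_hugepages now checks against the live snapshot.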
setup/hugepages.sh@89 -- # local node 00:03:36.403 06:13:05 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:36.403 06:13:05 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:36.403 06:13:05 -- setup/hugepages.sh@92 -- # local surp 00:03:36.403 06:13:05 -- setup/hugepages.sh@93 -- # local resv 00:03:36.403 06:13:05 -- setup/hugepages.sh@94 -- # local anon 00:03:36.403 06:13:05 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:36.403 06:13:05 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:36.403 06:13:05 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:36.403 06:13:05 -- setup/common.sh@18 -- # local node= 00:03:36.403 06:13:05 -- setup/common.sh@19 -- # local var val 00:03:36.403 06:13:05 -- setup/common.sh@20 -- # local mem_f mem 00:03:36.403 06:13:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:36.403 06:13:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:36.403 06:13:05 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:36.403 06:13:05 -- setup/common.sh@28 -- # mapfile -t mem 00:03:36.403 06:13:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:36.403 06:13:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.403 06:13:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.403 06:13:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42642700 kB' 'MemAvailable: 44223684 kB' 'Buffers: 4384 kB' 'Cached: 9641156 kB' 'SwapCached: 76 kB' 'Active: 6677544 kB' 'Inactive: 3555396 kB' 'Active(anon): 5765044 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591176 kB' 'Mapped: 177496 kB' 'Shmem: 7894008 kB' 'KReclaimable: 567212 kB' 'Slab: 1558160 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 990948 kB' 'KernelStack: 22096 kB' 'PageTables: 8992 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10051256 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218308 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[repetitive xtrace condensed: the setup/common.sh@31/@32 loop reads each field above in turn and hits 'continue' on every name that is not AnonHugePages]
00:03:36.404 06:13:05 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:36.404 06:13:05 -- setup/common.sh@33 -- # echo 0 00:03:36.404 06:13:05 -- setup/common.sh@33 -- # return 0 00:03:36.404 06:13:05 -- setup/hugepages.sh@97 -- # anon=0
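Each of these reads (AnonHugePages above, then HugePages_Surp, HugePages_Rsvd, and HugePages_Total below) is the same get_meminfo helper from setup/common.sh resolving one field of /proc/meminfo. A minimal sketch of that lookup, assuming plain bash and the mapfile/IFS=': ' pattern the trace shows (an illustration, not the verbatim setup/common.sh):

    #!/usr/bin/env bash
    shopt -s extglob
    # Look up one field in /proc/meminfo, or in a per-NUMA-node meminfo when a
    # node number is given. Per-node files prefix every line with "Node N ",
    # which the extglob expansion strips, mirroring the trace above.
    get_meminfo() {
        local get=$1 node=$2
        local mem_f=/proc/meminfo
        [[ -e /sys/devices/system/node/node$node/meminfo && -n $node ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")
        local line var val _
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        echo 0
    }
    get_meminfo HugePages_Total       # prints 1536 on this runner
    get_meminfo HugePages_Free 0      # node 0's count (512 here, per the node0 snapshot below)

One reading aid: when the right-hand side of == inside [[ ]] is a quoted variable, bash xtrace prints it with every character backslash-escaped, e.g. \A\n\o\n\H\u\g\e\P\a\g\e\s, to show that it is matched literally rather than as a glob pattern; the comparisons in this log are not garbled.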
00:03:36.404 06:13:05 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:36.404 06:13:05 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:36.404 06:13:05 -- setup/common.sh@18 -- # local node= 00:03:36.404 06:13:05 -- setup/common.sh@19 -- # local var val 00:03:36.404 06:13:05 -- setup/common.sh@20 -- # local mem_f mem 00:03:36.404 06:13:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:36.404 06:13:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:36.404 06:13:05 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:36.404 06:13:05 -- setup/common.sh@28 -- # mapfile -t mem 00:03:36.404 06:13:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:36.404 06:13:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.405 06:13:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.405 06:13:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42642640 kB' 'MemAvailable: 44223624 kB' 'Buffers: 4384 kB' 'Cached: 9641160 kB' 'SwapCached: 76 kB' 'Active: 6678620 kB' 'Inactive: 3555396 kB' 'Active(anon): 5766120 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591768 kB' 'Mapped: 177624 kB' 'Shmem: 7894012 kB' 'KReclaimable: 567212 kB' 'Slab: 1558196 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 990984 kB' 'KernelStack: 22144 kB' 'PageTables: 8948 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10051268 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218308 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[repetitive xtrace condensed: the setup/common.sh@31/@32 loop reads each field above in turn and hits 'continue' on every name that is not HugePages_Surp]
00:03:36.406 06:13:05 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.406 06:13:05 -- setup/common.sh@33 -- # echo 0 00:03:36.406 06:13:05 -- setup/common.sh@33 -- # return 0 00:03:36.406 06:13:05 -- setup/hugepages.sh@99 -- # surp=0 00:03:36.406 06:13:05 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:36.406 06:13:05 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:36.406 06:13:05 -- setup/common.sh@18 -- # local node= 00:03:36.406 06:13:05 -- setup/common.sh@19 -- # local var val 00:03:36.406 06:13:05 -- setup/common.sh@20 -- # local mem_f mem 00:03:36.406 06:13:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:36.406 06:13:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:36.406 06:13:05 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:36.406 06:13:05 -- setup/common.sh@28 -- # mapfile -t mem 00:03:36.406 06:13:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:36.406 06:13:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.406 06:13:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.406 06:13:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42641616 kB' 'MemAvailable: 44222600 kB' 'Buffers: 4384 kB' 'Cached: 9641172 kB' 'SwapCached: 76 kB' 'Active: 6677964 kB' 'Inactive: 3555396 kB' 'Active(anon): 5765464 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591024 kB' 'Mapped: 177544 kB' 'Shmem: 7894024 kB' 'KReclaimable: 567212 kB' 'Slab: 1558228 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991016 kB' 'KernelStack: 22032 kB' 'PageTables: 9112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10051284 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218228 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[repetitive xtrace condensed: the setup/common.sh@31/@32 loop reads each field above in turn and hits 'continue' on every name that is not HugePages_Rsvd]
00:03:36.407 06:13:05 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.407 06:13:05 -- setup/common.sh@33 -- # echo 0 00:03:36.407 06:13:05 -- setup/common.sh@33 -- # return 0 00:03:36.407 06:13:05 -- setup/hugepages.sh@100 -- # resv=0 00:03:36.407 06:13:05 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
00:03:36.407 nr_hugepages=1536
00:03:36.407 06:13:05 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:36.407 resv_hugepages=0
00:03:36.407 06:13:05 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:36.407 surplus_hugepages=0
00:03:36.407 06:13:05 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:36.407 anon_hugepages=0
00:03:36.407 06:13:05 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:36.407 06:13:05 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
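The echoes and the (( ... )) tests above are the heart of verify_nr_hugepages: the kernel's HugePages_Total must equal the count the test configured plus any surplus and reserved pages. Roughly, under the same assumptions as the get_meminfo sketch earlier:

    # Accounting identity behind '(( 1536 == nr_hugepages + surp + resv ))':
    nr_requested=1536                       # 512 on node0 + 1024 on node1
    total=$(get_meminfo HugePages_Total)    # 1536 here
    surp=$(get_meminfo HugePages_Surp)      # 0 here
    resv=$(get_meminfo HugePages_Rsvd)      # 0 here
    (( total == nr_requested + surp + resv )) || echo "hugepage accounting is off"

With surplus and reserved both zero, the identity reduces to total == 1536, which is exactly what the HugePages_Total read that follows confirms.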
00:03:36.408 06:13:05 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:36.408 06:13:05 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:36.408 06:13:05 -- setup/common.sh@18 -- # local node= 00:03:36.408 06:13:05 -- setup/common.sh@19 -- # local var val 00:03:36.408 06:13:05 -- setup/common.sh@20 -- # local mem_f mem 00:03:36.408 06:13:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:36.408 06:13:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:36.408 06:13:05 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:36.408 06:13:05 -- setup/common.sh@28 -- # mapfile -t mem 00:03:36.408 06:13:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:36.408 06:13:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.408 06:13:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.408 06:13:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42643580 kB' 'MemAvailable: 44224564 kB' 'Buffers: 4384 kB' 'Cached: 9641188 kB' 'SwapCached: 76 kB' 'Active: 6677592 kB' 'Inactive: 3555396 kB' 'Active(anon): 5765092 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590104 kB' 'Mapped: 177544 kB' 'Shmem: 7894040 kB' 'KReclaimable: 567212 kB' 'Slab: 1558228 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991016 kB' 'KernelStack: 21808 kB' 'PageTables: 8624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10046752 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218116 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[repetitive xtrace condensed: the setup/common.sh@31/@32 loop reads each field above in turn and hits 'continue' on every name that is not HugePages_Total]
00:03:36.409 06:13:05 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:36.409 06:13:05 -- setup/common.sh@33 -- # echo 1536
00:03:36.409 06:13:05 -- setup/common.sh@33 -- # return 0 00:03:36.409 06:13:05 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:36.409 06:13:05 -- setup/hugepages.sh@112 -- # get_nodes 00:03:36.409 06:13:05 -- setup/hugepages.sh@27 -- # local node 00:03:36.409 06:13:05 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:36.409 06:13:05 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:36.409 06:13:05 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:36.409 06:13:05 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:36.409 06:13:05 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:36.409 06:13:05 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:36.409 06:13:05 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:36.409 06:13:05 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:36.409 06:13:05 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:36.409 06:13:05 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:36.409 06:13:05 -- setup/common.sh@18 -- # local node=0 00:03:36.409 06:13:05 -- setup/common.sh@19 -- # local var val 00:03:36.409 06:13:05 -- setup/common.sh@20 -- # local mem_f mem 00:03:36.409 06:13:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:36.409 06:13:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:36.409 06:13:05 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:36.409 06:13:05 -- setup/common.sh@28 -- # mapfile -t mem 00:03:36.409 06:13:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:36.409 06:13:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.409 06:13:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.409 06:13:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24591808 kB' 'MemUsed: 8042628 kB' 'SwapCached: 44 kB' 'Active: 4296068 kB' 'Inactive: 532564 kB' 'Active(anon): 3518640 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4590908 kB' 'Mapped: 118176 kB' 'AnonPages: 240892 kB' 'Shmem: 3280928 kB' 'KernelStack: 10888 kB' 'PageTables: 4676 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 394740 kB' 'Slab: 873148 kB' 'SReclaimable: 394740 kB' 'SUnreclaim: 478408 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:36.409 06:13:05 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.409 06:13:05 -- setup/common.sh@32 -- # continue 00:03:36.409 06:13:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.409 06:13:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.409 06:13:05 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.409 06:13:05 -- setup/common.sh@32 -- # continue 00:03:36.409 06:13:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.409 06:13:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.409 06:13:05 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:36.409 06:13:05 -- setup/common.sh@32 -- # continue 00:03:36.409 06:13:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.409 06:13:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.409 
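Note: nearly everything in this stretch of the log is one bash helper doing a linear scan of a meminfo dump: setup/common.sh's get_meminfo splits each line on ': ' and hits continue for every field that is not the requested key, which is why each field shows up as its own @31/@32 xtrace record. A minimal sketch of that pattern, reconstructed from the xtrace rather than copied from the script:

get_meminfo_sketch() {                   # e.g. get_meminfo_sketch HugePages_Total
    local get=$1 var val _
    while IFS=': ' read -r var val _; do # "_" swallows the trailing "kB"
        [[ $var == "$get" ]] || continue # one "@32 ... continue" record per field
        echo "$val"                      # the "@33 echo" record on a match
        return 0
    done < /proc/meminfo
    return 1                             # key not present
}

Run against the node-0 dump just printed, the same walk below looks for HugePages_Surp and yields 0.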
[xtrace condensed: the @31/@32 scan repeats for every field of the node-0 dump above, MemTotal through HugePages_Free, none of which is the requested key]
00:03:36.410 06:13:05 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:36.410 06:13:05 -- setup/common.sh@33 -- # echo 0
00:03:36.410 06:13:05 -- setup/common.sh@33 -- # return 0
00:03:36.410 06:13:05 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:36.410 06:13:05 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:36.410 06:13:05 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:36.410 06:13:05 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:36.410 06:13:05 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:36.410 06:13:05 -- setup/common.sh@18 -- # local node=1
00:03:36.410 06:13:05 -- setup/common.sh@19 -- # local var val
00:03:36.410 06:13:05 -- setup/common.sh@20 -- # local mem_f mem
00:03:36.410 06:13:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:36.410 06:13:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:36.410 06:13:05 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:36.410 06:13:05 -- setup/common.sh@28 -- # mapfile -t mem
00:03:36.410 06:13:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:36.410 06:13:05 -- setup/common.sh@31 -- # IFS=': '
00:03:36.410 06:13:05 -- setup/common.sh@31 -- # read -r var val _
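Note: the @22-@29 records above show the per-node variant of the same query: when a node number is given and /sys/devices/system/node/node1/meminfo exists, that file replaces /proc/meminfo, and because its lines read "Node 1 MemTotal: ...", the mapfile/mem=() pair strips the prefix before the scan runs unchanged. A sketch of just that normalization (assumes a two-node host like this one):

shopt -s extglob                                      # "+([0-9])" is an extglob pattern
mapfile -t mem < /sys/devices/system/node/node1/meminfo
mem=("${mem[@]#Node +([0-9]) }")                      # "Node 1 MemTotal: ..." -> "MemTotal: ..."
printf '%s\n' "${mem[@]}" | grep '^HugePages_Surp:'   # -> HugePages_Surp: 0 on this run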
00:03:36.410 06:13:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18052080 kB' 'MemUsed: 9597280 kB' 'SwapCached: 32 kB' 'Active: 2380496 kB' 'Inactive: 3022832 kB' 'Active(anon): 2245424 kB' 'Inactive(anon): 2716308 kB' 'Active(file): 135072 kB' 'Inactive(file): 306524 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5054764 kB' 'Mapped: 59368 kB' 'AnonPages: 348700 kB' 'Shmem: 4613136 kB' 'KernelStack: 10936 kB' 'PageTables: 3832 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 172472 kB' 'Slab: 685240 kB' 'SReclaimable: 172472 kB' 'SUnreclaim: 512768 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace condensed: the @31/@32 scan repeats for every field of the node-1 dump above, MemTotal through HugePages_Free, until the requested key comes up]
00:03:36.411 06:13:05 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:36.411 06:13:05 -- setup/common.sh@33 -- # echo 0
00:03:36.411 06:13:05 -- setup/common.sh@33 -- # return 0
00:03:36.411 06:13:05 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:36.411 06:13:05 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:36.411 06:13:05 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:36.411 06:13:05 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:36.411 06:13:05 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:36.411 node0=512 expecting 512
00:03:36.411 06:13:05 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:36.411 06:13:05 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:36.411 06:13:05 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:36.411 06:13:05 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:03:36.411 node1=1024 expecting 1024
00:03:36.411 06:13:05 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:03:36.411
00:03:36.411 real 0m3.301s
00:03:36.411 user 0m1.149s
00:03:36.411 sys 0m2.083s
00:03:36.411 06:13:05 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:36.411 06:13:05 -- common/autotest_common.sh@10 -- # set +x
00:03:36.411 ************************************
00:03:36.411 END TEST custom_alloc
00:03:36.411 ************************************
00:03:36.411 06:13:05 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:03:36.411 06:13:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:36.411 06:13:05 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:36.411 06:13:05 -- common/autotest_common.sh@10 -- # set +x
00:03:36.411 ************************************
00:03:36.411 START TEST no_shrink_alloc
00:03:36.411 ************************************
00:03:36.412 06:13:05 -- common/autotest_common.sh@1114 -- # no_shrink_alloc
00:03:36.412 06:13:05 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:03:36.412 06:13:05 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:36.412 06:13:05 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:03:36.412 06:13:05 -- setup/hugepages.sh@51 -- # shift
00:03:36.412 06:13:05 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:03:36.412 06:13:05 -- setup/hugepages.sh@52 -- # local node_ids
00:03:36.412 06:13:05 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:36.412 06:13:05 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:36.412 06:13:05 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:03:36.412 06:13:05 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:03:36.412 06:13:05 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:36.412 06:13:05 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:36.412 06:13:05 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:36.412 06:13:05 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:36.412 06:13:05 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:36.412 06:13:05 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
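Note: the sizing just traced (hugepages.sh@195 through @69 above) and the @70/@71 loop that follows reduce to a little arithmetic: taking the argument as kB, 2097152 kB / 2048 kB per hugepage = 1024 pages, all pinned on the single user-supplied node 0. A sketch under that kB assumption, which is what reproduces the nr_hugepages=1024 seen here:

size=2097152                                 # requested test size in kB (assumption)
default_hugepages=2048                       # kB per 2 MiB hugepage
nr_hugepages=$((size / default_hugepages))   # -> 1024
user_nodes=(0)                               # node_ids=('0') in the xtrace
declare -A nodes_test=()
for _no_nodes in "${user_nodes[@]}"; do
    nodes_test[$_no_nodes]=$nr_hugepages     # node 0 gets all 1024 pages
done
echo "nr_hugepages=$nr_hugepages on nodes: ${!nodes_test[*]}"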
00:03:36.412 06:13:05 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:36.412 06:13:05 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:03:36.412 06:13:05 -- setup/hugepages.sh@73 -- # return 0
00:03:36.412 06:13:05 -- setup/hugepages.sh@198 -- # setup output
00:03:36.412 06:13:05 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:36.412 06:13:05 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:39.709 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:39.709 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:39.709 06:13:09 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:03:39.710 06:13:09 -- setup/hugepages.sh@89 -- # local node
00:03:39.710 06:13:09 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:39.710 06:13:09 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:39.710 06:13:09 -- setup/hugepages.sh@92 -- # local surp
00:03:39.710 06:13:09 -- setup/hugepages.sh@93 -- # local resv
00:03:39.710 06:13:09 -- setup/hugepages.sh@94 -- # local anon
00:03:39.710 06:13:09 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:39.710 06:13:09 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:39.710 06:13:09 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:39.710 06:13:09 -- setup/common.sh@18 -- # local node=
00:03:39.710 06:13:09 -- setup/common.sh@19 -- # local var val
00:03:39.710 06:13:09 -- setup/common.sh@20 -- # local mem_f mem
00:03:39.710 06:13:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:39.710 06:13:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:39.710 06:13:09 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:39.710 06:13:09 -- setup/common.sh@28 -- # mapfile -t mem
00:03:39.710 06:13:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:39.710 06:13:09 -- setup/common.sh@31 -- # IFS=': '
00:03:39.710 06:13:09 -- setup/common.sh@31 -- # read -r var val _
00:03:39.710 06:13:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43682904 kB' 'MemAvailable: 45263888 kB' 'Buffers: 4384 kB' 'Cached: 9641280 kB' 'SwapCached: 76 kB' 'Active: 6678304 kB' 'Inactive: 3555396 kB' 'Active(anon): 5765804 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590840 kB' 'Mapped: 177680 kB' 'Shmem: 7894132 kB' 'KReclaimable: 567212 kB' 'Slab: 1558624 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991412 kB' 'KernelStack: 21872 kB' 'PageTables: 8684 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10047364 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
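Note: verify_nr_hugepages first rules out transparent hugepages as a confound. The @96 record above matches the contents of /sys/kernel/mm/transparent_hugepage/enabled -- "always [madvise] never" on this host -- against "[never]", and only when THP is not fully disabled does it fetch AnonHugePages (0 kB in the dump just printed). A sketch of that gate, with the path and logic reconstructed from the xtrace:

thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)              # e.g. "always [madvise] never"
anon=0
if [[ $thp != *"[never]"* ]]; then                                # THP not disabled outright
    anon=$(awk '$1 == "AnonHugePages:" {print $2}' /proc/meminfo) # kB of THP-backed anon memory
fi
echo "anon=$anon"

The scan that follows is exactly that fetch: it lands on AnonHugePages and echoes 0.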
[xtrace condensed: the @31/@32 scan repeats for every field of the dump above, MemTotal through HardwareCorrupted, until the requested key comes up]
00:03:39.711 06:13:09 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:39.711 06:13:09 -- setup/common.sh@33 -- # echo 0
00:03:39.711 06:13:09 -- setup/common.sh@33 -- # return 0
00:03:39.711 06:13:09 -- setup/hugepages.sh@97 -- # anon=0
00:03:39.711 06:13:09 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:39.711 06:13:09 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:39.711 06:13:09 -- setup/common.sh@18 -- # local node=
00:03:39.711 06:13:09 -- setup/common.sh@19 -- # local var val
00:03:39.711 06:13:09 -- setup/common.sh@20 -- # local mem_f mem
00:03:39.711 06:13:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:39.711 06:13:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:39.711 06:13:09 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:39.711 06:13:09 -- setup/common.sh@28 -- # mapfile -t mem
00:03:39.711 06:13:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:39.711 06:13:09 -- setup/common.sh@31 -- # IFS=': '
00:03:39.711 06:13:09 -- setup/common.sh@31 -- # read -r var val _
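Note: the @99/@100 queries here and just below collect the terms of the identity this verification rests on -- the same one the hugepages.sh@110 record asserted earlier as (( 1536 == nr_hugepages + surp + resv )). A self-contained sketch of the accounting; the meminfo helper is illustrative, and the commented values are from this run:

nr_hugepages=1024                                  # configured for no_shrink_alloc
meminfo() { awk -v k="$1:" '$1 == k {print $2}' /proc/meminfo; }
surp=$(meminfo HugePages_Surp)                     # 0 in this run
resv=$(meminfo HugePages_Rsvd)                     # queried next in the log; also 0
total=$(meminfo HugePages_Total)                   # 1024 in this run
(( total == nr_hugepages + surp + resv )) \
    && echo "hugepage accounting consistent" \
    || echo "mismatch: total=$total nr=$nr_hugepages surp=$surp resv=$resv"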
00:03:39.710 06:13:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43682372 kB' 'MemAvailable: 45263356 kB' 'Buffers: 4384 kB' 'Cached: 9641284 kB' 'SwapCached: 76 kB' 'Active: 6677572 kB' 'Inactive: 3555396 kB' 'Active(anon): 5765072 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590552 kB' 'Mapped: 177560 kB' 'Shmem: 7894136 kB' 'KReclaimable: 567212 kB' 'Slab: 1558588 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991376 kB' 'KernelStack: 21840 kB' 'PageTables: 8560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10047376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[xtrace condensed: the @31/@32 scan repeats for every field of the dump above, MemTotal through HugePages_Rsvd, until the requested key comes up]
00:03:39.711 06:13:09 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:39.711 06:13:09 -- setup/common.sh@33 -- # echo 0
00:03:39.711 06:13:09 -- setup/common.sh@33 -- # return 0
00:03:39.711 06:13:09 -- setup/hugepages.sh@99 -- # surp=0
00:03:39.711 06:13:09 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:39.711 06:13:09 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:39.711 06:13:09 -- setup/common.sh@18 -- # local node=
00:03:39.712 06:13:09 -- setup/common.sh@19 -- # local var val
00:03:39.712 06:13:09 -- setup/common.sh@20 -- # local mem_f mem
00:03:39.712 06:13:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:39.712 06:13:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:39.712 06:13:09 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:39.712 06:13:09 -- setup/common.sh@28 -- # mapfile -t mem
00:03:39.712 06:13:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:39.712 06:13:09 -- setup/common.sh@31 -- # IFS=': '
00:03:39.712 06:13:09 -- setup/common.sh@31 -- # read -r var val _
00:03:39.712 06:13:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43683100 kB' 'MemAvailable: 45264084 kB' 'Buffers: 4384 kB' 'Cached: 9641296 kB' 'SwapCached: 76 kB' 'Active: 6677608 kB' 'Inactive: 3555396 kB' 'Active(anon): 5765108 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590556 kB' 'Mapped: 177560 kB' 'Shmem: 7894148 kB' 'KReclaimable: 567212 kB' 'Slab: 1558588 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991376 kB' 'KernelStack: 21840 kB' 'PageTables: 8560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10047388 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:39.712 06:13:09 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:39.712 06:13:09 -- setup/common.sh@32 -- # continue
00:03:39.712 06:13:09 -- setup/common.sh@31 -- # IFS=': '
00:03:39.712 06:13:09 -- setup/common.sh@31 -- # read -r var val _
00:03:39.712 06:13:09 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:39.712 06:13:09 -- setup/common.sh@32 -- # continue
00:03:39.712 06:13:09 -- setup/common.sh@31 -- # IFS=': '
00:03:39.712 06:13:09 -- setup/common.sh@31 -- # read -r var val _
00:03:39.712 06:13:09 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:39.712 06:13:09 -- setup/common.sh@32 -- # continue
00:03:39.712 06:13:09 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:39.712 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.712 06:13:09 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r 
var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 
00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.713 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.713 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.714 06:13:09 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.714 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.714 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.714 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.714 06:13:09 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.714 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.714 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.714 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.714 06:13:09 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.714 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.714 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.714 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.714 06:13:09 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.714 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.714 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.714 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.714 06:13:09 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.714 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.714 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.714 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.714 06:13:09 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.714 06:13:09 -- setup/common.sh@32 -- # continue 00:03:39.714 06:13:09 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.714 06:13:09 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.714 06:13:09 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.714 06:13:09 -- setup/common.sh@33 -- # echo 0 00:03:39.714 06:13:09 -- setup/common.sh@33 -- # return 0 00:03:39.714 06:13:09 -- setup/hugepages.sh@100 -- # resv=0 00:03:39.714 06:13:09 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:39.714 nr_hugepages=1024 00:03:39.714 06:13:09 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:39.714 resv_hugepages=0 00:03:39.714 06:13:09 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:39.714 surplus_hugepages=0 00:03:39.714 06:13:09 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:39.714 anon_hugepages=0 00:03:39.714 06:13:09 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:39.714 06:13:09 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 
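Note: the condensed @31/@32 read + continue loops above are a single lookup helper walking the printf'd meminfo snapshot key by key. A minimal standalone sketch of that lookup pattern, reconstructed from the xtrace (an illustration under assumptions, not the verbatim SPDK setup/common.sh):

#!/usr/bin/env bash
# Minimal sketch of the get_meminfo lookup traced above; reconstructed from
# the xtrace, NOT the verbatim SPDK setup/common.sh.
shopt -s extglob
get_meminfo() {
    local get=$1 node=$2
    local line var val _
    local mem_f=/proc/meminfo
    # A per-node query reads that node's own meminfo instead of the global file.
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    while read -r line; do
        line=${line#Node +([0-9]) }        # node files prefix each key with "Node N "
        IFS=': ' read -r var val _ <<<"$line"
        [[ $var == "$get" ]] || continue   # the repeated @32 "continue" in the trace
        echo "$val"                        # kB value, or a page count for HugePages_*
        return 0
    done <"$mem_f"
    return 1
}
get_meminfo HugePages_Total     # prints 1024 on the node traced above
get_meminfo HugePages_Surp 0    # per-node form, as in the hugepages.sh@117 call

The scan is linear, which is why the trace repeats one compare-and-continue pair per meminfo key before each value is returned.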
00:03:39.714 06:13:09 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:39.714 06:13:09 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:39.714 06:13:09 -- setup/common.sh@18 -- # local node=
00:03:39.714 06:13:09 -- setup/common.sh@19 -- # local var val
00:03:39.714 06:13:09 -- setup/common.sh@20 -- # local mem_f mem
00:03:39.714 06:13:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:39.714 06:13:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:39.714 06:13:09 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:39.714 06:13:09 -- setup/common.sh@28 -- # mapfile -t mem
00:03:39.714 06:13:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:39.714 06:13:09 -- setup/common.sh@31 -- # IFS=': '
00:03:39.714 06:13:09 -- setup/common.sh@31 -- # read -r var val _
00:03:39.714 06:13:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43681336 kB' 'MemAvailable: 45262320 kB' 'Buffers: 4384 kB' 'Cached: 9641308 kB' 'SwapCached: 76 kB' 'Active: 6678812 kB' 'Inactive: 3555396 kB' 'Active(anon): 5766312 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591760 kB' 'Mapped: 178064 kB' 'Shmem: 7894160 kB' 'KReclaimable: 567212 kB' 'Slab: 1558588 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991376 kB' 'KernelStack: 21824 kB' 'PageTables: 8512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10049288 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[xtrace condensed: read + continue loop over every snapshot key (MemTotal ... Unaccepted) until HugePages_Total matches]
00:03:39.715 06:13:09 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:39.715 06:13:09 -- setup/common.sh@33 -- # echo 1024
00:03:39.973 06:13:09 -- setup/common.sh@33 -- # return 0
00:03:39.973 06:13:09 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:39.973 06:13:09 -- setup/hugepages.sh@112 -- # get_nodes
00:03:39.973 06:13:09 -- setup/hugepages.sh@27 -- # local node
00:03:39.973 06:13:09 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:39.973 06:13:09 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:39.973 06:13:09 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:39.973 06:13:09 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:39.973 06:13:09 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:39.973 06:13:09 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:39.973 06:13:09 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:39.973 06:13:09 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
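The get_nodes trace above globs /sys/devices/system/node/node<N> and records each node's hugepage total (node0=1024, node1=0 on this machine). A rough sketch of that enumeration, assuming the same sysfs layout (nodes_sys and no_nodes are names from the trace; the awk lookup is an assumed illustration, not the SPDK helper itself):

#!/usr/bin/env bash
# Sketch of per-NUMA-node hugepage accounting; an assumption-based
# illustration of the get_nodes pattern seen in the trace.
shopt -s extglob nullglob
declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
    # Per-node meminfo lines look like "Node 0 HugePages_Total:  1024".
    nodes_sys[${node##*node}]=$(awk '/HugePages_Total/ {print $NF}' "$node/meminfo")
done
no_nodes=${#nodes_sys[@]}
echo "no_nodes=$no_nodes"                # 2 on the machine traced above
for n in "${!nodes_sys[@]}"; do
    echo "node$n=${nodes_sys[$n]}"       # node0=1024, node1=0 in this run
done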
00:03:39.973 06:13:09 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:39.973 06:13:09 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:39.973 06:13:09 -- setup/common.sh@18 -- # local node=0
00:03:39.973 06:13:09 -- setup/common.sh@19 -- # local var val
00:03:39.973 06:13:09 -- setup/common.sh@20 -- # local mem_f mem
00:03:39.973 06:13:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:39.973 06:13:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:39.973 06:13:09 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:39.973 06:13:09 -- setup/common.sh@28 -- # mapfile -t mem
00:03:39.973 06:13:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:39.973 06:13:09 -- setup/common.sh@31 -- # IFS=': '
00:03:39.973 06:13:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23554892 kB' 'MemUsed: 9079544 kB' 'SwapCached: 44 kB' 'Active: 4297252 kB' 'Inactive: 532564 kB' 'Active(anon): 3519824 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4591000 kB' 'Mapped: 118696 kB' 'AnonPages: 242000 kB' 'Shmem: 3281020 kB' 'KernelStack: 10904 kB' 'PageTables: 4716 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 394740 kB' 'Slab: 872984 kB' 'SReclaimable: 394740 kB' 'SUnreclaim: 478244 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:39.973 06:13:09 -- setup/common.sh@31 -- # read -r var val _
[xtrace condensed: read + continue loop over every node0 meminfo key (MemTotal ... HugePages_Free) until HugePages_Surp matches]
00:03:39.974 06:13:09 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:39.974 06:13:09 -- setup/common.sh@33 -- # echo 0
00:03:39.974 06:13:09 -- setup/common.sh@33 -- # return 0
00:03:39.974 06:13:09 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:39.974 06:13:09 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:39.974 06:13:09 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:39.974 06:13:09 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:39.974 06:13:09 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:39.974 node0=1024 expecting 1024
00:03:39.974 06:13:09 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:39.974 06:13:09 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:03:39.974 06:13:09 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:03:39.974 06:13:09 -- setup/hugepages.sh@202 -- # setup output
00:03:39.974 06:13:09 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:39.974 06:13:09 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:43.267 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:43.267 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:43.267 INFO: Requested 512 hugepages but 1024 already allocated on node0
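setup.sh was asked for NRHUGE=512 2048 kB pages but found 1024 already reserved, so it left the pool untouched. A hedged sketch of that "already allocated" shortcut against the per-node sysfs counter (a hypothetical simplification for illustration, not the actual scripts/setup.sh logic):

#!/usr/bin/env bash
# Sketch of the allocation check behind the INFO line above; assumption-based,
# not the verbatim SPDK scripts/setup.sh.
NRHUGE=${NRHUGE:-512}
nr=/sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
allocated=$(<"$nr")
if (( allocated >= NRHUGE )); then
    # Enough 2 MiB pages are already reserved; leave the pool alone.
    echo "INFO: Requested $NRHUGE hugepages but $allocated already allocated on node0"
else
    # Writing the target asks the kernel to reserve more pages (requires root).
    echo "$NRHUGE" > "$nr"
fi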
00:03:43.267 06:13:12 -- setup/common.sh@18 -- # local node=
00:03:43.267 06:13:12 -- setup/common.sh@19 -- # local var val
00:03:43.267 06:13:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:43.267 06:13:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:43.267 06:13:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:43.267 06:13:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:43.267 06:13:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:43.267 06:13:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:43.267 06:13:12 -- setup/common.sh@31 -- # IFS=': '
00:03:43.267 06:13:12 -- setup/common.sh@31 -- # read -r var val _
00:03:43.267 06:13:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43681860 kB' 'MemAvailable: 45262844 kB' 'Buffers: 4384 kB' 'Cached: 9641396 kB' 'SwapCached: 76 kB' 'Active: 6678612 kB' 'Inactive: 3555396 kB' 'Active(anon): 5766112 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591432 kB' 'Mapped: 177668 kB' 'Shmem: 7894248 kB' 'KReclaimable: 567212 kB' 'Slab: 1559048 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991836 kB' 'KernelStack: 21840 kB' 'PageTables: 8584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10048004 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218116 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:43.268 [xtrace elided: setup/common.sh@31/@32 compare each field from MemTotal through HardwareCorrupted against \A\n\o\n\H\u\g\e\P\a\g\e\s and continue]
00:03:43.268 06:13:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:43.268 06:13:12 -- setup/common.sh@33 -- # echo 0
00:03:43.268 06:13:12 -- setup/common.sh@33 -- # return 0
00:03:43.268 06:13:12 -- setup/hugepages.sh@97 -- # anon=0
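A minimal standalone sketch of the get_meminfo pattern the trace above shows (mapfile over a meminfo file, strip any "Node N " prefix, then an IFS=': ' read scan for one key). The flow follows the trace, but the details are reconstructed, not SPDK's actual setup/common.sh:

    #!/usr/bin/env bash
    shopt -s extglob
    get_meminfo() {
      # Print the value for one meminfo key; optional 2nd arg selects a NUMA node.
      local get=$1 node=${2:-}
      local var val _ mem line
      local mem_f=/proc/meminfo
      [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
      mapfile -t mem <"$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")      # per-node lines carry "Node N "
      for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<<"$line"
        [[ $var == "$get" ]] || continue    # quoted => literal key match
        echo "$val"
        return 0
      done
      return 1
    }
    get_meminfo HugePages_Total    # prints 1024 on this box
    get_meminfo HugePages_Surp 0   # reads node 0's meminfo instead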
00:03:43.268 06:13:12 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:43.268 06:13:12 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:43.268 06:13:12 -- setup/common.sh@18 -- # local node=
00:03:43.268 06:13:12 -- setup/common.sh@19 -- # local var val
00:03:43.268 06:13:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:43.268 06:13:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:43.268 06:13:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:43.268 06:13:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:43.269 06:13:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:43.269 06:13:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:43.269 06:13:12 -- setup/common.sh@31 -- # IFS=': '
00:03:43.269 06:13:12 -- setup/common.sh@31 -- # read -r var val _
00:03:43.269 06:13:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43683104 kB' 'MemAvailable: 45264088 kB' 'Buffers: 4384 kB' 'Cached: 9641400 kB' 'SwapCached: 76 kB' 'Active: 6678852 kB' 'Inactive: 3555396 kB' 'Active(anon): 5766352 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591676 kB' 'Mapped: 177564 kB' 'Shmem: 7894252 kB' 'KReclaimable: 567212 kB' 'Slab: 1559024 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991812 kB' 'KernelStack: 21840 kB' 'PageTables: 8568 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10048016 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:43.269 [xtrace elided: setup/common.sh@31/@32 compare each field from MemTotal through HugePages_Rsvd against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and continue]
00:03:43.270 06:13:12 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:43.270 06:13:12 -- setup/common.sh@33 -- # echo 0
00:03:43.270 06:13:12 -- setup/common.sh@33 -- # return 0
00:03:43.270 06:13:12 -- setup/hugepages.sh@99 -- # surp=0
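The backslash runs in the comparisons above (e.g. \H\u\g\e\P\a\g\e\s\_\S\u\r\p) are not corruption: bash xtrace escapes every character of a quoted [[ == ]] right-hand side to show it is matched literally rather than as a glob. A two-line illustration, independent of the SPDK scripts:

    key='HugePages_Surp*'                  # would glob-match if unquoted
    [[ HugePages_Surplus == $key ]]  && echo glob      # pattern match succeeds
    [[ HugePages_Surplus == "$key" ]] || echo literal  # quoted form fails
    # under `bash -x`, the quoted form prints as \H\u\g\e\P\a\g\e\s...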
00:03:43.270 06:13:12 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:43.270 06:13:12 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:43.270 06:13:12 -- setup/common.sh@18 -- # local node=
00:03:43.270 06:13:12 -- setup/common.sh@19 -- # local var val
00:03:43.270 06:13:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:43.270 06:13:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:43.270 06:13:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:43.270 06:13:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:43.270 06:13:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:43.270 06:13:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:43.270 06:13:12 -- setup/common.sh@31 -- # IFS=': '
00:03:43.270 06:13:12 -- setup/common.sh@31 -- # read -r var val _
00:03:43.270 06:13:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43683916 kB' 'MemAvailable: 45264900 kB' 'Buffers: 4384 kB' 'Cached: 9641412 kB' 'SwapCached: 76 kB' 'Active: 6681300 kB' 'Inactive: 3555396 kB' 'Active(anon): 5768800 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 594124 kB' 'Mapped: 178064 kB' 'Shmem: 7894264 kB' 'KReclaimable: 567212 kB' 'Slab: 1559024 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991812 kB' 'KernelStack: 21824 kB' 'PageTables: 8512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10051248 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:43.270 [xtrace elided: setup/common.sh@31/@32 compare each field from MemTotal through HugePages_Free against \H\u\g\e\P\a\g\e\s\_\R\s\v\d and continue]
00:03:43.271 06:13:12 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:43.271 06:13:12 -- setup/common.sh@33 -- # echo 0
00:03:43.271 06:13:12 -- setup/common.sh@33 -- # return 0
00:03:43.271 06:13:12 -- setup/hugepages.sh@100 -- # resv=0
00:03:43.271 06:13:12 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:43.271 nr_hugepages=1024
00:03:43.271 06:13:12 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:43.271 resv_hugepages=0
00:03:43.271 06:13:12 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:43.271 surplus_hugepages=0
00:03:43.271 06:13:12 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:43.271 anon_hugepages=0
00:03:43.271 06:13:12 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:43.271 06:13:12 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
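hugepages.sh@97-@109 above compute anon, surp and resv, then assert that the configured page count equals nr_hugepages + surp + resv before cross-checking HugePages_Total. A self-contained recheck of that arithmetic against the live kernel counters; the 1024 target mirrors this job's configuration and is an assumption here:

    #!/usr/bin/env bash
    # Re-derive the hugepages.sh consistency check from standard kernel files.
    want=1024                                  # assumed: this job's configured count
    nr=$(cat /proc/sys/vm/nr_hugepages)
    read -r _ surp  <<<"$(grep '^HugePages_Surp:'  /proc/meminfo)"
    read -r _ resv  <<<"$(grep '^HugePages_Rsvd:'  /proc/meminfo)"
    read -r _ total <<<"$(grep '^HugePages_Total:' /proc/meminfo)"
    (( want == nr + surp + resv && total == want )) &&
      echo "consistent: total=$total nr=$nr surp=$surp resv=$resv"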
00:03:43.271 06:13:12 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:43.271 06:13:12 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:43.271 06:13:12 -- setup/common.sh@18 -- # local node=
00:03:43.271 06:13:12 -- setup/common.sh@19 -- # local var val
00:03:43.271 06:13:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:43.271 06:13:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:43.271 06:13:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:43.271 06:13:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:43.272 06:13:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:43.272 06:13:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:43.272 06:13:12 -- setup/common.sh@31 -- # IFS=': '
00:03:43.272 06:13:12 -- setup/common.sh@31 -- # read -r var val _
00:03:43.272 06:13:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43686000 kB' 'MemAvailable: 45266984 kB' 'Buffers: 4384 kB' 'Cached: 9641428 kB' 'SwapCached: 76 kB' 'Active: 6678920 kB' 'Inactive: 3555396 kB' 'Active(anon): 5766420 kB' 'Inactive(anon): 2716364 kB' 'Active(file): 912500 kB' 'Inactive(file): 839032 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8056828 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591824 kB' 'Mapped: 178048 kB' 'Shmem: 7894280 kB' 'KReclaimable: 567212 kB' 'Slab: 1559024 kB' 'SReclaimable: 567212 kB' 'SUnreclaim: 991812 kB' 'KernelStack: 21840 kB' 'PageTables: 8612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10076304 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218052 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:43.272 [xtrace elided: setup/common.sh@31/@32 compare each field from MemTotal through Unaccepted against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l and continue]
00:03:43.273 06:13:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:43.273 06:13:12 -- setup/common.sh@33 -- # echo 1024
00:03:43.273 06:13:12 -- setup/common.sh@33 -- # return 0
00:03:43.273 06:13:12 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:43.273 06:13:12 -- setup/hugepages.sh@112 -- # get_nodes
00:03:43.273 06:13:12 -- setup/hugepages.sh@27 -- # local node
00:03:43.273 06:13:12 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:43.273 06:13:12 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:43.273 06:13:12 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:43.273 06:13:12 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:43.273 06:13:12 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:43.273 06:13:12 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
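get_nodes above enumerates /sys/devices/system/node/node+([0-9]) (an extglob pattern) and records a per-node page count, and the per-node queries that follow re-read the same counters from each node's own meminfo file, whose lines carry a "Node N " prefix. A sketch of that walk under those assumptions (the counter chosen here is for illustration):

    #!/usr/bin/env bash
    shopt -s extglob nullglob
    # Walk NUMA nodes as get_nodes does in the trace, then read one
    # per-node counter and strip the "Node N " prefix as common.sh@29 does.
    for node in /sys/devices/system/node/node+([0-9]); do
      id=${node##*node}                      # /sys/.../node0 -> 0
      line=$(grep HugePages_Total "$node/meminfo")
      line=${line#Node +([0-9]) }            # "Node 0 HugePages_Total: 1024" -> "HugePages_Total: 1024"
      echo "node$id: $line"
    done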
00:03:43.273 06:13:12 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:43.273 06:13:12 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:43.273 06:13:12 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:43.273 06:13:12 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:43.273 06:13:12 -- setup/common.sh@18 -- # local node=0
00:03:43.273 06:13:12 -- setup/common.sh@19 -- # local var val
00:03:43.273 06:13:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:43.273 06:13:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:43.273 06:13:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:43.273 06:13:12 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:43.273 06:13:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:43.273 06:13:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:43.273 06:13:12 -- setup/common.sh@31 -- # IFS=': '
00:03:43.273 06:13:12 -- setup/common.sh@31 -- # read -r var val _
00:03:43.273 06:13:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23570604 kB' 'MemUsed: 9063832 kB' 'SwapCached: 44 kB' 'Active: 4297360 kB' 'Inactive: 532564 kB' 'Active(anon): 3519932 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4591088 kB' 'Mapped: 118696 kB' 'AnonPages: 242080 kB' 'Shmem: 3281108 kB' 'KernelStack: 10888 kB' 'PageTables: 4668 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 394740 kB' 'Slab: 873268 kB' 'SReclaimable: 394740 kB' 'SUnreclaim: 478528 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:43.273 [xtrace elided: setup/common.sh@31/@32 compare each field from MemTotal through ShmemHugePages against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and continue]
00:03:43.274 06:13:12 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:43.274 06:13:12 --
setup/common.sh@32 -- # continue 00:03:43.274 06:13:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:43.274 06:13:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:43.274 06:13:12 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.274 06:13:12 -- setup/common.sh@32 -- # continue 00:03:43.274 06:13:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:43.274 06:13:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:43.274 06:13:12 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.274 06:13:12 -- setup/common.sh@32 -- # continue 00:03:43.274 06:13:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:43.274 06:13:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:43.274 06:13:12 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.274 06:13:12 -- setup/common.sh@32 -- # continue 00:03:43.274 06:13:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:43.274 06:13:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:43.274 06:13:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.274 06:13:12 -- setup/common.sh@32 -- # continue 00:03:43.274 06:13:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:43.274 06:13:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:43.274 06:13:12 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.274 06:13:12 -- setup/common.sh@32 -- # continue 00:03:43.274 06:13:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:43.274 06:13:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:43.274 06:13:12 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.274 06:13:12 -- setup/common.sh@33 -- # echo 0 00:03:43.274 06:13:12 -- setup/common.sh@33 -- # return 0 00:03:43.274 06:13:12 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:43.274 06:13:12 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:43.274 06:13:12 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:43.274 06:13:12 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:43.274 06:13:12 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:43.274 node0=1024 expecting 1024 00:03:43.274 06:13:12 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:43.274 00:03:43.274 real 0m6.998s 00:03:43.274 user 0m2.686s 00:03:43.274 sys 0m4.437s 00:03:43.274 06:13:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:43.274 06:13:12 -- common/autotest_common.sh@10 -- # set +x 00:03:43.274 ************************************ 00:03:43.274 END TEST no_shrink_alloc 00:03:43.274 ************************************ 00:03:43.535 06:13:12 -- setup/hugepages.sh@217 -- # clear_hp 00:03:43.535 06:13:12 -- setup/hugepages.sh@37 -- # local node hp 00:03:43.535 06:13:12 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:43.535 06:13:12 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:43.535 06:13:12 -- setup/hugepages.sh@41 -- # echo 0 00:03:43.535 06:13:12 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:43.535 06:13:12 -- setup/hugepages.sh@41 -- # echo 0 00:03:43.535 06:13:12 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:43.535 06:13:12 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:43.535 06:13:12 -- setup/hugepages.sh@41 -- # echo 0 00:03:43.535 06:13:12 
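Editor's note: the wall of [[ key == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] checks above is bash xtrace of setup/common.sh's get_meminfo helper walking every field of a (per-node) meminfo file until the requested key matches, then echoing its value (1024 for HugePages_Total, 0 for node0's HugePages_Surp). A minimal standalone sketch of the same parsing idea, reconstructed from the trace rather than copied from the SPDK helper:

    shopt -s extglob   # needed for the +([0-9]) pattern below

    # Sketch: fetch one field from /proc/meminfo, or from a NUMA node's
    # meminfo file when a node number is given (layout ours).
    get_meminfo() {
      local get=$1 node=${2:-} mem_f=/proc/meminfo
      local -a mem
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")   # per-node lines start with "Node N "
      local line var val _
      for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
      done
      return 1
    }

    get_meminfo HugePages_Total     # -> 1024 on this box
    get_meminfo HugePages_Surp 0    # node0 value -> 0 in this run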
-- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:43.535 06:13:12 -- setup/hugepages.sh@41 -- # echo 0 00:03:43.535 06:13:12 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:43.535 06:13:12 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:43.535 00:03:43.535 real 0m26.838s 00:03:43.535 user 0m9.381s 00:03:43.535 sys 0m16.210s 00:03:43.535 06:13:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:43.535 06:13:12 -- common/autotest_common.sh@10 -- # set +x 00:03:43.535 ************************************ 00:03:43.535 END TEST hugepages 00:03:43.535 ************************************ 00:03:43.535 06:13:12 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:43.535 06:13:12 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:43.535 06:13:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:43.535 06:13:12 -- common/autotest_common.sh@10 -- # set +x 00:03:43.535 ************************************ 00:03:43.535 START TEST driver 00:03:43.535 ************************************ 00:03:43.535 06:13:12 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:43.535 * Looking for test storage... 00:03:43.535 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:43.535 06:13:12 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:43.535 06:13:12 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:43.535 06:13:12 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:43.535 06:13:13 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:43.535 06:13:13 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:43.535 06:13:13 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:43.535 06:13:13 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:43.535 06:13:13 -- scripts/common.sh@335 -- # IFS=.-: 00:03:43.535 06:13:13 -- scripts/common.sh@335 -- # read -ra ver1 00:03:43.535 06:13:13 -- scripts/common.sh@336 -- # IFS=.-: 00:03:43.535 06:13:13 -- scripts/common.sh@336 -- # read -ra ver2 00:03:43.535 06:13:13 -- scripts/common.sh@337 -- # local 'op=<' 00:03:43.535 06:13:13 -- scripts/common.sh@339 -- # ver1_l=2 00:03:43.535 06:13:13 -- scripts/common.sh@340 -- # ver2_l=1 00:03:43.535 06:13:13 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:43.535 06:13:13 -- scripts/common.sh@343 -- # case "$op" in 00:03:43.535 06:13:13 -- scripts/common.sh@344 -- # : 1 00:03:43.535 06:13:13 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:43.535 06:13:13 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:43.535 06:13:13 -- scripts/common.sh@364 -- # decimal 1 00:03:43.535 06:13:13 -- scripts/common.sh@352 -- # local d=1 00:03:43.535 06:13:13 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:43.535 06:13:13 -- scripts/common.sh@354 -- # echo 1 00:03:43.535 06:13:13 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:43.535 06:13:13 -- scripts/common.sh@365 -- # decimal 2 00:03:43.535 06:13:13 -- scripts/common.sh@352 -- # local d=2 00:03:43.535 06:13:13 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:43.535 06:13:13 -- scripts/common.sh@354 -- # echo 2 00:03:43.535 06:13:13 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:43.535 06:13:13 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:43.535 06:13:13 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:43.535 06:13:13 -- scripts/common.sh@367 -- # return 0 00:03:43.535 06:13:13 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:43.535 06:13:13 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:43.535 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:43.535 --rc genhtml_branch_coverage=1 00:03:43.535 --rc genhtml_function_coverage=1 00:03:43.535 --rc genhtml_legend=1 00:03:43.535 --rc geninfo_all_blocks=1 00:03:43.535 --rc geninfo_unexecuted_blocks=1 00:03:43.535 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:43.535 ' 00:03:43.535 06:13:13 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:43.535 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:43.535 --rc genhtml_branch_coverage=1 00:03:43.535 --rc genhtml_function_coverage=1 00:03:43.535 --rc genhtml_legend=1 00:03:43.535 --rc geninfo_all_blocks=1 00:03:43.535 --rc geninfo_unexecuted_blocks=1 00:03:43.535 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:43.535 ' 00:03:43.535 06:13:13 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:43.535 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:43.535 --rc genhtml_branch_coverage=1 00:03:43.535 --rc genhtml_function_coverage=1 00:03:43.535 --rc genhtml_legend=1 00:03:43.535 --rc geninfo_all_blocks=1 00:03:43.535 --rc geninfo_unexecuted_blocks=1 00:03:43.535 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:43.535 ' 00:03:43.535 06:13:13 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:43.535 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:43.535 --rc genhtml_branch_coverage=1 00:03:43.535 --rc genhtml_function_coverage=1 00:03:43.535 --rc genhtml_legend=1 00:03:43.535 --rc geninfo_all_blocks=1 00:03:43.535 --rc geninfo_unexecuted_blocks=1 00:03:43.535 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:43.535 ' 00:03:43.535 06:13:13 -- setup/driver.sh@68 -- # setup reset 00:03:43.535 06:13:13 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:43.535 06:13:13 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:48.816 06:13:17 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:03:48.816 06:13:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:48.816 06:13:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:48.816 06:13:17 -- common/autotest_common.sh@10 -- # set +x 00:03:48.816 ************************************ 00:03:48.816 START TEST guess_driver 
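Editor's note: the lt 1.15 2 block that runs before each test suite (here, and again before the devices suite below) is scripts/common.sh splitting two dotted versions on "." and "-" and comparing them field by field to decide whether the legacy lcov option set is needed. A compact sketch of that comparison, with a simplified name and no handling of non-numeric fields:

    # Sketch: succeed when dotted version $1 sorts below $2
    # (condensed take on the cmp_versions idea, not the SPDK original).
    version_lt() {
      local -a v1 v2; local i
      IFS=.- read -ra v1 <<< "$1"
      IFS=.- read -ra v2 <<< "$2"
      for ((i = 0; i < ${#v1[@]} || i < ${#v2[@]}; i++)); do
        (( ${v1[i]:-0} < ${v2[i]:-0} )) && return 0   # missing fields count as 0
        (( ${v1[i]:-0} > ${v2[i]:-0} )) && return 1
      done
      return 1   # equal is not "less than"
    }

    version_lt 1.15 2 && echo "old lcov: use the branch/function coverage opts"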
00:03:48.816 ************************************ 00:03:48.816 06:13:17 -- common/autotest_common.sh@1114 -- # guess_driver 00:03:48.816 06:13:17 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:03:48.816 06:13:17 -- setup/driver.sh@47 -- # local fail=0 00:03:48.816 06:13:17 -- setup/driver.sh@49 -- # pick_driver 00:03:48.816 06:13:17 -- setup/driver.sh@36 -- # vfio 00:03:48.816 06:13:17 -- setup/driver.sh@21 -- # local iommu_grups 00:03:48.816 06:13:17 -- setup/driver.sh@22 -- # local unsafe_vfio 00:03:48.816 06:13:17 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:03:48.816 06:13:17 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:03:48.816 06:13:17 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:03:48.816 06:13:17 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:03:48.816 06:13:17 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:03:48.816 06:13:17 -- setup/driver.sh@14 -- # mod vfio_pci 00:03:48.816 06:13:17 -- setup/driver.sh@12 -- # dep vfio_pci 00:03:48.816 06:13:17 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:03:48.816 06:13:17 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:03:48.816 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:48.816 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:48.816 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:48.816 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:48.816 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:03:48.816 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:03:48.816 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:03:48.816 06:13:17 -- setup/driver.sh@30 -- # return 0 00:03:48.816 06:13:17 -- setup/driver.sh@37 -- # echo vfio-pci 00:03:48.816 06:13:17 -- setup/driver.sh@49 -- # driver=vfio-pci 00:03:48.816 06:13:17 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:03:48.816 06:13:17 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:03:48.816 Looking for driver=vfio-pci 00:03:48.816 06:13:17 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:48.816 06:13:17 -- setup/driver.sh@45 -- # setup output config 00:03:48.816 06:13:17 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:48.816 06:13:17 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:52.106 06:13:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:20 -- 
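Editor's note: pick_driver above lands on vfio-pci because the host exposes 176 IOMMU groups and modprobe --show-depends can resolve vfio_pci to loadable modules. A condensed sketch of that decision; the uio_pci_generic fallback is our assumption about the path not taken in this run:

    # Sketch: prefer vfio-pci when the IOMMU is usable, else fall back.
    pick_driver() {
      shopt -s nullglob
      local groups=(/sys/kernel/iommu_groups/*)
      if (( ${#groups[@]} > 0 )) && modprobe --show-depends vfio_pci &> /dev/null; then
        echo vfio-pci
      elif modprobe --show-depends uio_pci_generic &> /dev/null; then
        echo uio_pci_generic            # assumed fallback, not exercised here
      else
        echo 'No valid driver found'
      fi
    }

    pick_driver   # -> vfio-pci on this host (176 IOMMU groups)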
setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:21 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:21 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:21 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:21 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:21 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:21 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:21 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:21 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:21 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:21 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:21 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:21 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:21 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:21 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:21 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:21 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:21 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:21 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:21 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:21 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:21 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:21 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:21 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:21 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:21 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:21 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:21 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:21 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:21 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:21 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:52.106 06:13:21 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:52.106 06:13:21 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:52.106 06:13:21 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:53.488 06:13:22 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:53.488 06:13:22 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:53.488 06:13:22 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:53.488 06:13:22 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:03:53.488 06:13:22 -- setup/driver.sh@65 -- # setup reset 00:03:53.488 06:13:22 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:53.488 06:13:22 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:58.766 00:03:58.766 real 0m9.880s 00:03:58.766 user 0m2.526s 00:03:58.766 sys 0m5.058s 00:03:58.766 06:13:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:58.766 06:13:27 -- common/autotest_common.sh@10 
-- # set +x 00:03:58.766 ************************************ 00:03:58.766 END TEST guess_driver 00:03:58.766 ************************************ 00:03:58.766 00:03:58.766 real 0m14.751s 00:03:58.766 user 0m3.932s 00:03:58.766 sys 0m7.769s 00:03:58.766 06:13:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:58.766 06:13:27 -- common/autotest_common.sh@10 -- # set +x 00:03:58.766 ************************************ 00:03:58.766 END TEST driver 00:03:58.766 ************************************ 00:03:58.766 06:13:27 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:03:58.766 06:13:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:58.766 06:13:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:58.766 06:13:27 -- common/autotest_common.sh@10 -- # set +x 00:03:58.766 ************************************ 00:03:58.766 START TEST devices 00:03:58.766 ************************************ 00:03:58.766 06:13:27 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:03:58.766 * Looking for test storage... 00:03:58.766 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:58.766 06:13:27 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:58.766 06:13:27 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:58.766 06:13:27 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:58.766 06:13:27 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:58.766 06:13:27 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:58.766 06:13:27 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:58.766 06:13:27 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:58.766 06:13:27 -- scripts/common.sh@335 -- # IFS=.-: 00:03:58.766 06:13:27 -- scripts/common.sh@335 -- # read -ra ver1 00:03:58.766 06:13:27 -- scripts/common.sh@336 -- # IFS=.-: 00:03:58.766 06:13:27 -- scripts/common.sh@336 -- # read -ra ver2 00:03:58.766 06:13:27 -- scripts/common.sh@337 -- # local 'op=<' 00:03:58.766 06:13:27 -- scripts/common.sh@339 -- # ver1_l=2 00:03:58.766 06:13:27 -- scripts/common.sh@340 -- # ver2_l=1 00:03:58.766 06:13:27 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:58.766 06:13:27 -- scripts/common.sh@343 -- # case "$op" in 00:03:58.766 06:13:27 -- scripts/common.sh@344 -- # : 1 00:03:58.766 06:13:27 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:58.766 06:13:27 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:58.766 06:13:27 -- scripts/common.sh@364 -- # decimal 1 00:03:58.766 06:13:27 -- scripts/common.sh@352 -- # local d=1 00:03:58.766 06:13:27 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:58.766 06:13:27 -- scripts/common.sh@354 -- # echo 1 00:03:58.766 06:13:27 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:58.766 06:13:27 -- scripts/common.sh@365 -- # decimal 2 00:03:58.766 06:13:27 -- scripts/common.sh@352 -- # local d=2 00:03:58.766 06:13:27 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:58.766 06:13:27 -- scripts/common.sh@354 -- # echo 2 00:03:58.766 06:13:27 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:58.766 06:13:27 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:58.766 06:13:27 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:58.766 06:13:27 -- scripts/common.sh@367 -- # return 0 00:03:58.766 06:13:27 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:58.766 06:13:27 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:58.766 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.766 --rc genhtml_branch_coverage=1 00:03:58.766 --rc genhtml_function_coverage=1 00:03:58.766 --rc genhtml_legend=1 00:03:58.766 --rc geninfo_all_blocks=1 00:03:58.766 --rc geninfo_unexecuted_blocks=1 00:03:58.766 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.766 ' 00:03:58.766 06:13:27 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:58.766 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.766 --rc genhtml_branch_coverage=1 00:03:58.766 --rc genhtml_function_coverage=1 00:03:58.766 --rc genhtml_legend=1 00:03:58.766 --rc geninfo_all_blocks=1 00:03:58.766 --rc geninfo_unexecuted_blocks=1 00:03:58.766 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.766 ' 00:03:58.766 06:13:27 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:58.766 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.766 --rc genhtml_branch_coverage=1 00:03:58.766 --rc genhtml_function_coverage=1 00:03:58.766 --rc genhtml_legend=1 00:03:58.766 --rc geninfo_all_blocks=1 00:03:58.766 --rc geninfo_unexecuted_blocks=1 00:03:58.766 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.766 ' 00:03:58.766 06:13:27 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:58.766 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.766 --rc genhtml_branch_coverage=1 00:03:58.766 --rc genhtml_function_coverage=1 00:03:58.766 --rc genhtml_legend=1 00:03:58.766 --rc geninfo_all_blocks=1 00:03:58.766 --rc geninfo_unexecuted_blocks=1 00:03:58.766 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.766 ' 00:03:58.766 06:13:27 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:03:58.766 06:13:27 -- setup/devices.sh@192 -- # setup reset 00:03:58.766 06:13:27 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:58.766 06:13:27 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:02.059 06:13:31 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:02.059 06:13:31 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:04:02.059 06:13:31 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:04:02.059 06:13:31 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:04:02.059 06:13:31 
-- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:04:02.059 06:13:31 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:04:02.059 06:13:31 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:04:02.059 06:13:31 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:02.059 06:13:31 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:04:02.059 06:13:31 -- setup/devices.sh@196 -- # blocks=() 00:04:02.059 06:13:31 -- setup/devices.sh@196 -- # declare -a blocks 00:04:02.059 06:13:31 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:02.059 06:13:31 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:02.059 06:13:31 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:02.059 06:13:31 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:02.059 06:13:31 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:02.059 06:13:31 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:02.059 06:13:31 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:02.059 06:13:31 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:02.059 06:13:31 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:02.059 06:13:31 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:02.059 06:13:31 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:02.319 No valid GPT data, bailing 00:04:02.319 06:13:31 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:02.319 06:13:31 -- scripts/common.sh@393 -- # pt= 00:04:02.319 06:13:31 -- scripts/common.sh@394 -- # return 1 00:04:02.319 06:13:31 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:02.319 06:13:31 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:02.319 06:13:31 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:02.319 06:13:31 -- setup/common.sh@80 -- # echo 1600321314816 00:04:02.319 06:13:31 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:02.319 06:13:31 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:02.319 06:13:31 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:02.319 06:13:31 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:02.319 06:13:31 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:02.319 06:13:31 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:02.319 06:13:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:02.319 06:13:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:02.319 06:13:31 -- common/autotest_common.sh@10 -- # set +x 00:04:02.319 ************************************ 00:04:02.319 START TEST nvme_mount 00:04:02.319 ************************************ 00:04:02.319 06:13:31 -- common/autotest_common.sh@1114 -- # nvme_mount 00:04:02.319 06:13:31 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:02.319 06:13:31 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:02.319 06:13:31 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:02.319 06:13:31 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:02.319 06:13:31 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:02.319 06:13:31 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:02.319 06:13:31 -- setup/common.sh@40 -- # local part_no=1 00:04:02.319 06:13:31 -- setup/common.sh@41 -- # 
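Editor's note: before nvme_mount starts, devices.sh has just qualified nvme0n1 as the test disk: its zoned attribute is "none", spdk-gpt.py and blkid find no partition table ("No valid GPT data, bailing"), and its 1600321314816 bytes clear the 3221225472-byte (3 GiB) floor. A sketch of those gates, with our own helper name:

    # Sketch: accept a block device for the mount tests only if it is
    # not zoned, carries no partition table, and is at least 3 GiB.
    min_disk_size=$((3 * 1024 * 1024 * 1024))   # 3221225472, as in the trace
    usable_disk() {
      local dev=$1 zoned pt size
      zoned=$(cat "/sys/block/$dev/queue/zoned" 2> /dev/null || echo none)
      [[ $zoned == none ]] || return 1                 # skip zoned devices
      pt=$(blkid -s PTTYPE -o value "/dev/$dev")
      [[ -z $pt ]] || return 1                         # existing partition table
      size=$(( $(cat "/sys/block/$dev/size") * 512 ))  # sectors -> bytes
      (( size >= min_disk_size ))
    }

    usable_disk nvme0n1 && echo "nvme0n1 ok for the mount tests"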
local size=1073741824 00:04:02.319 06:13:31 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:02.319 06:13:31 -- setup/common.sh@44 -- # parts=() 00:04:02.319 06:13:31 -- setup/common.sh@44 -- # local parts 00:04:02.319 06:13:31 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:02.319 06:13:31 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:02.319 06:13:31 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:02.319 06:13:31 -- setup/common.sh@46 -- # (( part++ )) 00:04:02.319 06:13:31 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:02.319 06:13:31 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:02.319 06:13:31 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:02.319 06:13:31 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:03.256 Creating new GPT entries in memory. 00:04:03.256 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:03.256 other utilities. 00:04:03.256 06:13:32 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:03.256 06:13:32 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:03.256 06:13:32 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:03.256 06:13:32 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:03.256 06:13:32 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:04.193 Creating new GPT entries in memory. 00:04:04.193 The operation has completed successfully. 00:04:04.193 06:13:33 -- setup/common.sh@57 -- # (( part++ )) 00:04:04.193 06:13:33 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:04.193 06:13:33 -- setup/common.sh@62 -- # wait 4181725 00:04:04.454 06:13:33 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:04.454 06:13:33 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:04.454 06:13:33 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:04.454 06:13:33 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:04.454 06:13:33 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:04.454 06:13:33 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:04.454 06:13:33 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:04.454 06:13:33 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:04.454 06:13:33 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:04.454 06:13:33 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:04.454 06:13:33 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:04.454 06:13:33 -- setup/devices.sh@53 -- # local found=0 00:04:04.454 06:13:33 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:04.454 06:13:33 -- setup/devices.sh@56 -- # : 00:04:04.454 06:13:33 -- setup/devices.sh@59 -- # local pci status 00:04:04.454 06:13:33 -- 
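Editor's note: partition_drive above zaps the GPT and carves one 1 GiB partition: size 1073741824 / 512 = 2097152 sectors, start 2048, end 2048 + 2097152 - 1 = 2099199, exactly the --new=1:2048:2099199 in the trace. A sketch generalized to N partitions, with sgdisk run under flock as the trace does to serialize against concurrent readers of the partition table:

    # Sketch: carve N 1 GiB partitions, 2048-sector aligned. Destructive.
    partition_drive() {
      local disk=$1 part_no=${2:-1}
      local size=$((1073741824 / 512))   # 1 GiB in 512 B sectors
      local part part_start=0 part_end=0
      sgdisk "/dev/$disk" --zap-all
      for ((part = 1; part <= part_no; part++)); do
        ((part_start = part_start == 0 ? 2048 : part_end + 1))
        ((part_end = part_start + size - 1))
        flock "/dev/$disk" sgdisk "/dev/$disk" --new=$part:$part_start:$part_end
      done
    }

    # partition_drive nvme0n1 2   # -> nvme0n1p1, nvme0n1p2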
setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.454 06:13:33 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:04.454 06:13:33 -- setup/devices.sh@47 -- # setup output config 00:04:04.454 06:13:33 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:04.454 06:13:33 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:07.747 06:13:36 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.747 06:13:36 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:07.747 06:13:36 -- setup/devices.sh@63 -- # found=1 00:04:07.748 06:13:36 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:36 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:36 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:36 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:36 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:36 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:36 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:07.748 06:13:37 -- 
setup/devices.sh@66 -- # (( found == 1 )) 00:04:07.748 06:13:37 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:07.748 06:13:37 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:07.748 06:13:37 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:07.748 06:13:37 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:07.748 06:13:37 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:07.748 06:13:37 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:07.748 06:13:37 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:07.748 06:13:37 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:07.748 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:07.748 06:13:37 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:07.748 06:13:37 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:08.007 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:08.007 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:08.007 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:08.007 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:08.007 06:13:37 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:08.007 06:13:37 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:08.007 06:13:37 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:08.007 06:13:37 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:08.007 06:13:37 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:08.266 06:13:37 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:08.266 06:13:37 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:08.266 06:13:37 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:08.266 06:13:37 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:08.266 06:13:37 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:08.266 06:13:37 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:08.266 06:13:37 -- setup/devices.sh@53 -- # local found=0 00:04:08.266 06:13:37 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:08.266 06:13:37 -- setup/devices.sh@56 -- # : 00:04:08.267 06:13:37 -- setup/devices.sh@59 -- # local pci status 00:04:08.267 06:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:08.267 06:13:37 -- setup/devices.sh@47 -- # 
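Editor's note: the wipefs output above is the nvme_mount teardown: unmount, wipe the ext4 signature off the partition, then wipe the GPT (primary header, backup header at the end of the disk, and the protective MBR) off the whole device so the next test starts from a clean disk. As a sketch, assuming the mount point and disk are passed in:

    # Sketch: tear down a test mount and erase on-disk signatures.
    cleanup_nvme() {
      local mnt=$1 disk=$2
      mountpoint -q "$mnt" && umount "$mnt"
      [[ -b /dev/${disk}p1 ]] && wipefs --all "/dev/${disk}p1"  # ext4 magic
      [[ -b /dev/$disk ]] && wipefs --all "/dev/$disk"          # GPT + PMBR
    }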
PCI_ALLOWED=0000:d8:00.0 00:04:08.267 06:13:37 -- setup/devices.sh@47 -- # setup output config 00:04:08.267 06:13:37 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:08.267 06:13:37 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:11.559 06:13:40 -- setup/devices.sh@63 -- # found=1 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:11.559 06:13:40 -- setup/devices.sh@68 -- # [[ -n 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:11.559 06:13:40 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.559 06:13:40 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:11.559 06:13:40 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:11.559 06:13:40 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.559 06:13:40 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:04:11.559 06:13:40 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:11.559 06:13:40 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:11.559 06:13:40 -- setup/devices.sh@50 -- # local mount_point= 00:04:11.559 06:13:40 -- setup/devices.sh@51 -- # local test_file= 00:04:11.559 06:13:40 -- setup/devices.sh@53 -- # local found=0 00:04:11.559 06:13:40 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:11.559 06:13:40 -- setup/devices.sh@59 -- # local pci status 00:04:11.559 06:13:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.559 06:13:40 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:11.559 06:13:40 -- setup/devices.sh@47 -- # setup output config 00:04:11.559 06:13:40 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:11.559 06:13:40 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:14.851 06:13:43 -- setup/devices.sh@63 -- # found=1 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 
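Editor's note: each verify pass above reads setup.sh output line by line as "pci _ _ status" and, for the allowed device 0000:d8:00.0, requires the status text to name the expected active mount (first nvme0n1p1, then the whole namespace, then data@nvme0n1). A sketch of that loop; the setup.sh invocation path is an assumption:

    # Sketch: confirm the allowed PCI device reports the expected mount.
    verify_dev() {
      local dev=$1 mounts=$2 pci _ status found=0
      while read -r pci _ _ status; do
        [[ $pci == "$dev" ]] || continue
        [[ $status == *"Active devices: "*"$mounts"* ]] && found=1
      done < <(PCI_ALLOWED=$dev ./scripts/setup.sh config)
      (( found == 1 ))
    }

    verify_dev 0000:d8:00.0 nvme0n1:nvme0n1p1 && echo "mount visible"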
06:13:43 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:43 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:14.851 06:13:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.851 06:13:44 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:14.851 06:13:44 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:14.851 06:13:44 -- setup/devices.sh@68 -- # return 0 00:04:14.851 06:13:44 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:14.851 06:13:44 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:14.851 06:13:44 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:14.851 06:13:44 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:14.851 06:13:44 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:14.851 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:14.851 00:04:14.851 real 0m12.415s 00:04:14.851 user 0m3.521s 00:04:14.851 sys 0m6.796s 00:04:14.851 06:13:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:14.851 06:13:44 -- common/autotest_common.sh@10 -- # set +x 00:04:14.851 ************************************ 00:04:14.851 END TEST nvme_mount 00:04:14.851 ************************************ 00:04:14.851 06:13:44 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:14.851 06:13:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:14.851 06:13:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:14.851 06:13:44 -- common/autotest_common.sh@10 -- # set +x 00:04:14.851 ************************************ 00:04:14.851 START TEST dm_mount 00:04:14.851 ************************************ 00:04:14.851 06:13:44 -- common/autotest_common.sh@1114 -- # dm_mount 00:04:14.851 06:13:44 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:14.851 06:13:44 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:14.851 06:13:44 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:14.851 06:13:44 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:14.851 06:13:44 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:14.851 06:13:44 -- setup/common.sh@40 -- # local part_no=2 00:04:14.851 06:13:44 -- setup/common.sh@41 -- # local size=1073741824 00:04:14.851 06:13:44 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:14.851 06:13:44 -- setup/common.sh@44 -- # parts=() 00:04:14.851 06:13:44 -- setup/common.sh@44 -- # local parts 00:04:14.851 06:13:44 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:14.851 06:13:44 -- setup/common.sh@46 -- # (( part <= part_no )) 
00:04:14.851 06:13:44 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:14.851 06:13:44 -- setup/common.sh@46 -- # (( part++ )) 00:04:14.851 06:13:44 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:14.851 06:13:44 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:14.851 06:13:44 -- setup/common.sh@46 -- # (( part++ )) 00:04:14.851 06:13:44 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:14.851 06:13:44 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:14.851 06:13:44 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:14.851 06:13:44 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:15.788 Creating new GPT entries in memory. 00:04:15.788 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:15.788 other utilities. 00:04:15.788 06:13:45 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:15.788 06:13:45 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:15.788 06:13:45 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:15.788 06:13:45 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:15.788 06:13:45 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:16.727 Creating new GPT entries in memory. 00:04:16.727 The operation has completed successfully. 00:04:16.727 06:13:46 -- setup/common.sh@57 -- # (( part++ )) 00:04:16.727 06:13:46 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:16.727 06:13:46 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:16.727 06:13:46 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:16.727 06:13:46 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:17.665 The operation has completed successfully. 
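Editor's note: both --new calls line up with dm_mount's two 1 GiB partitions; (end - start + 1) sectors times 512 bytes gives exactly 1 GiB for each:

    echo $(( (2099199 - 2048    + 1) * 512 ))   # 1073741824 bytes = 1 GiB (p1)
    echo $(( (4196351 - 2099200 + 1) * 512 ))   # 1073741824 bytes = 1 GiB (p2)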
00:04:17.665 06:13:47 -- setup/common.sh@57 -- # (( part++ )) 00:04:17.665 06:13:47 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:17.665 06:13:47 -- setup/common.sh@62 -- # wait 4186227 00:04:17.925 06:13:47 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:17.925 06:13:47 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:17.925 06:13:47 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:17.925 06:13:47 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:17.925 06:13:47 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:17.926 06:13:47 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:17.926 06:13:47 -- setup/devices.sh@161 -- # break 00:04:17.926 06:13:47 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:17.926 06:13:47 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:17.926 06:13:47 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:17.926 06:13:47 -- setup/devices.sh@166 -- # dm=dm-0 00:04:17.926 06:13:47 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:17.926 06:13:47 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:17.926 06:13:47 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:17.926 06:13:47 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:17.926 06:13:47 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:17.926 06:13:47 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:17.926 06:13:47 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:17.926 06:13:47 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:17.926 06:13:47 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:17.926 06:13:47 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:17.926 06:13:47 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:17.926 06:13:47 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:17.926 06:13:47 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:17.926 06:13:47 -- setup/devices.sh@53 -- # local found=0 00:04:17.926 06:13:47 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:17.926 06:13:47 -- setup/devices.sh@56 -- # : 00:04:17.926 06:13:47 -- setup/devices.sh@59 -- # local pci status 00:04:17.926 06:13:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.926 06:13:47 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:17.926 06:13:47 -- setup/devices.sh@47 -- # setup output config 00:04:17.926 06:13:47 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:17.926 06:13:47 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:21.224 06:13:50 -- 
setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:21.224 06:13:50 -- setup/devices.sh@63 -- # found=1 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:21.224 06:13:50 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:21.224 06:13:50 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:21.224 06:13:50 -- setup/devices.sh@73 -- # 
[[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:21.224 06:13:50 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:21.224 06:13:50 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:21.224 06:13:50 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:21.224 06:13:50 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:21.224 06:13:50 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:21.224 06:13:50 -- setup/devices.sh@50 -- # local mount_point= 00:04:21.224 06:13:50 -- setup/devices.sh@51 -- # local test_file= 00:04:21.224 06:13:50 -- setup/devices.sh@53 -- # local found=0 00:04:21.224 06:13:50 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:21.224 06:13:50 -- setup/devices.sh@59 -- # local pci status 00:04:21.224 06:13:50 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.224 06:13:50 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:21.224 06:13:50 -- setup/devices.sh@47 -- # setup output config 00:04:21.224 06:13:50 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:21.224 06:13:50 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:24.599 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.599 06:13:53 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:24.599 06:13:53 -- setup/devices.sh@63 -- # found=1 00:04:24.599 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.599 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.599 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.599 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.599 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.599 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.599 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.599 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.599 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.599 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.599 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.599 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.599 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.599 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.599 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.600 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.600 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.600 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.600 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.600 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.600 06:13:53 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.600 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.600 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.600 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.600 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.600 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.600 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.600 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.600 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.600 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.600 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.600 06:13:53 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.600 06:13:53 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.600 06:13:53 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:24.600 06:13:53 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:24.600 06:13:53 -- setup/devices.sh@68 -- # return 0 00:04:24.600 06:13:53 -- setup/devices.sh@187 -- # cleanup_dm 00:04:24.600 06:13:53 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:24.600 06:13:53 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:24.600 06:13:53 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:24.600 06:13:54 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:24.600 06:13:54 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:24.600 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:24.600 06:13:54 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:24.600 06:13:54 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:24.600 00:04:24.600 real 0m9.939s 00:04:24.600 user 0m2.484s 00:04:24.600 sys 0m4.543s 00:04:24.600 06:13:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:24.600 06:13:54 -- common/autotest_common.sh@10 -- # set +x 00:04:24.600 ************************************ 00:04:24.600 END TEST dm_mount 00:04:24.600 ************************************ 00:04:24.600 06:13:54 -- setup/devices.sh@1 -- # cleanup 00:04:24.600 06:13:54 -- setup/devices.sh@11 -- # cleanup_nvme 00:04:24.600 06:13:54 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:24.600 06:13:54 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:24.600 06:13:54 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:24.600 06:13:54 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:24.600 06:13:54 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:24.859 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:24.859 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:24.859 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:24.859 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:24.859 06:13:54 -- setup/devices.sh@12 -- # cleanup_dm 00:04:24.859 06:13:54 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:24.859 06:13:54 -- setup/devices.sh@36 -- # [[ -L 
/dev/mapper/nvme_dm_test ]] 00:04:24.859 06:13:54 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:24.860 06:13:54 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:24.860 06:13:54 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:24.860 06:13:54 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:24.860 00:04:24.860 real 0m26.731s 00:04:24.860 user 0m7.523s 00:04:24.860 sys 0m14.135s 00:04:24.860 06:13:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:24.860 06:13:54 -- common/autotest_common.sh@10 -- # set +x 00:04:24.860 ************************************ 00:04:24.860 END TEST devices 00:04:24.860 ************************************ 00:04:25.119 00:04:25.119 real 1m32.765s 00:04:25.119 user 0m28.678s 00:04:25.119 sys 0m53.022s 00:04:25.119 06:13:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:25.119 06:13:54 -- common/autotest_common.sh@10 -- # set +x 00:04:25.119 ************************************ 00:04:25.119 END TEST setup.sh 00:04:25.119 ************************************ 00:04:25.119 06:13:54 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:28.411 Hugepages 00:04:28.411 node hugesize free / total 00:04:28.411 node0 1048576kB 0 / 0 00:04:28.411 node0 2048kB 2048 / 2048 00:04:28.411 node1 1048576kB 0 / 0 00:04:28.411 node1 2048kB 0 / 0 00:04:28.411 00:04:28.411 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:28.411 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:28.411 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:28.411 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:28.411 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:28.411 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:28.411 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:28.411 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:28.411 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:28.411 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:28.411 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:28.411 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:28.411 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:28.411 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:28.411 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:28.411 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:28.411 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:28.411 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:04:28.411 06:13:57 -- spdk/autotest.sh@128 -- # uname -s 00:04:28.411 06:13:57 -- spdk/autotest.sh@128 -- # [[ Linux == Linux ]] 00:04:28.411 06:13:57 -- spdk/autotest.sh@130 -- # nvme_namespace_revert 00:04:28.411 06:13:57 -- common/autotest_common.sh@1526 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:31.702 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:31.702 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:31.702 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:31.702 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:31.702 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:31.702 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:31.960 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:31.960 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:31.960 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:31.960 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:31.960 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:31.960 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:31.960 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:31.960 
0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:31.960 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:31.960 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:33.339 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:33.597 06:14:02 -- common/autotest_common.sh@1527 -- # sleep 1 00:04:34.535 06:14:03 -- common/autotest_common.sh@1528 -- # bdfs=() 00:04:34.535 06:14:03 -- common/autotest_common.sh@1528 -- # local bdfs 00:04:34.535 06:14:03 -- common/autotest_common.sh@1529 -- # bdfs=($(get_nvme_bdfs)) 00:04:34.535 06:14:03 -- common/autotest_common.sh@1529 -- # get_nvme_bdfs 00:04:34.535 06:14:03 -- common/autotest_common.sh@1508 -- # bdfs=() 00:04:34.535 06:14:03 -- common/autotest_common.sh@1508 -- # local bdfs 00:04:34.535 06:14:03 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:34.535 06:14:03 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:34.535 06:14:03 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:04:34.795 06:14:04 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:04:34.795 06:14:04 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:04:34.795 06:14:04 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:38.087 Waiting for block devices as requested 00:04:38.087 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:38.087 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:38.087 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:38.087 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:38.347 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:38.347 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:38.347 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:38.347 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:38.606 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:38.606 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:38.606 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:38.866 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:38.866 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:38.866 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:39.125 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:39.125 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:39.125 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:04:39.384 06:14:08 -- common/autotest_common.sh@1533 -- # for bdf in "${bdfs[@]}" 00:04:39.384 06:14:08 -- common/autotest_common.sh@1534 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:04:39.384 06:14:08 -- common/autotest_common.sh@1497 -- # readlink -f /sys/class/nvme/nvme0 00:04:39.384 06:14:08 -- common/autotest_common.sh@1497 -- # grep 0000:d8:00.0/nvme/nvme 00:04:39.384 06:14:08 -- common/autotest_common.sh@1497 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:39.384 06:14:08 -- common/autotest_common.sh@1498 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:04:39.384 06:14:08 -- common/autotest_common.sh@1502 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:39.384 06:14:08 -- common/autotest_common.sh@1502 -- # printf '%s\n' nvme0 00:04:39.384 06:14:08 -- common/autotest_common.sh@1534 -- # nvme_ctrlr=/dev/nvme0 00:04:39.384 06:14:08 -- common/autotest_common.sh@1535 -- # [[ -z /dev/nvme0 ]] 00:04:39.384 06:14:08 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:04:39.384 
06:14:08 -- common/autotest_common.sh@1540 -- # grep oacs 00:04:39.384 06:14:08 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:04:39.384 06:14:08 -- common/autotest_common.sh@1540 -- # oacs=' 0xe' 00:04:39.384 06:14:08 -- common/autotest_common.sh@1541 -- # oacs_ns_manage=8 00:04:39.384 06:14:08 -- common/autotest_common.sh@1543 -- # [[ 8 -ne 0 ]] 00:04:39.384 06:14:08 -- common/autotest_common.sh@1549 -- # nvme id-ctrl /dev/nvme0 00:04:39.384 06:14:08 -- common/autotest_common.sh@1549 -- # grep unvmcap 00:04:39.384 06:14:08 -- common/autotest_common.sh@1549 -- # cut -d: -f2 00:04:39.384 06:14:08 -- common/autotest_common.sh@1549 -- # unvmcap=' 0' 00:04:39.384 06:14:08 -- common/autotest_common.sh@1550 -- # [[ 0 -eq 0 ]] 00:04:39.384 06:14:08 -- common/autotest_common.sh@1552 -- # continue 00:04:39.384 06:14:08 -- spdk/autotest.sh@133 -- # timing_exit pre_cleanup 00:04:39.384 06:14:08 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:39.384 06:14:08 -- common/autotest_common.sh@10 -- # set +x 00:04:39.384 06:14:08 -- spdk/autotest.sh@136 -- # timing_enter afterboot 00:04:39.384 06:14:08 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:39.384 06:14:08 -- common/autotest_common.sh@10 -- # set +x 00:04:39.384 06:14:08 -- spdk/autotest.sh@137 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:42.676 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:42.676 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:42.676 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:42.676 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:42.676 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:42.676 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:42.676 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:42.935 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:42.935 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:42.935 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:42.935 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:42.935 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:42.935 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:42.935 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:42.935 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:42.935 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:44.843 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:44.843 06:14:13 -- spdk/autotest.sh@138 -- # timing_exit afterboot 00:04:44.843 06:14:13 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:44.843 06:14:13 -- common/autotest_common.sh@10 -- # set +x 00:04:44.843 06:14:14 -- spdk/autotest.sh@142 -- # opal_revert_cleanup 00:04:44.843 06:14:14 -- common/autotest_common.sh@1586 -- # mapfile -t bdfs 00:04:44.843 06:14:14 -- common/autotest_common.sh@1586 -- # get_nvme_bdfs_by_id 0x0a54 00:04:44.843 06:14:14 -- common/autotest_common.sh@1572 -- # bdfs=() 00:04:44.843 06:14:14 -- common/autotest_common.sh@1572 -- # local bdfs 00:04:44.843 06:14:14 -- common/autotest_common.sh@1574 -- # get_nvme_bdfs 00:04:44.843 06:14:14 -- common/autotest_common.sh@1508 -- # bdfs=() 00:04:44.843 06:14:14 -- common/autotest_common.sh@1508 -- # local bdfs 00:04:44.843 06:14:14 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:44.843 06:14:14 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:44.843 06:14:14 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:04:44.843 06:14:14 -- 
common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:04:44.843 06:14:14 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:04:44.843 06:14:14 -- common/autotest_common.sh@1574 -- # for bdf in $(get_nvme_bdfs) 00:04:44.843 06:14:14 -- common/autotest_common.sh@1575 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:04:44.843 06:14:14 -- common/autotest_common.sh@1575 -- # device=0x0a54 00:04:44.843 06:14:14 -- common/autotest_common.sh@1576 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:04:44.843 06:14:14 -- common/autotest_common.sh@1577 -- # bdfs+=($bdf) 00:04:44.843 06:14:14 -- common/autotest_common.sh@1581 -- # printf '%s\n' 0000:d8:00.0 00:04:44.843 06:14:14 -- common/autotest_common.sh@1587 -- # [[ -z 0000:d8:00.0 ]] 00:04:44.843 06:14:14 -- common/autotest_common.sh@1592 -- # spdk_tgt_pid=2624 00:04:44.843 06:14:14 -- common/autotest_common.sh@1593 -- # waitforlisten 2624 00:04:44.843 06:14:14 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:44.843 06:14:14 -- common/autotest_common.sh@829 -- # '[' -z 2624 ']' 00:04:44.843 06:14:14 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:44.843 06:14:14 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:44.843 06:14:14 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:44.843 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:44.843 06:14:14 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:44.843 06:14:14 -- common/autotest_common.sh@10 -- # set +x 00:04:44.843 [2024-11-27 06:14:14.152871] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
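The waitforlisten step above amounts to launching spdk_tgt in the background and polling its UNIX-domain RPC socket until it answers. A hedged sketch using the workspace paths from this run; the retry budget and poll interval are illustrative, not the script's actual values:

#!/usr/bin/env bash
set -euo pipefail

rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
rpc_sock=/var/tmp/spdk.sock

"$rootdir/build/bin/spdk_tgt" &
tgt_pid=$!

echo "Waiting for process to start up and listen on UNIX domain socket $rpc_sock..."
for ((i = 0; i < 100; i++)); do
  # rpc_get_methods fails until the target is up and serving RPCs
  if "$rootdir/scripts/rpc.py" -s "$rpc_sock" rpc_get_methods &> /dev/null; then
    echo "spdk_tgt (pid $tgt_pid) is listening"
    exit 0
  fi
  sleep 0.1
done

kill "$tgt_pid"
echo "spdk_tgt never came up" >&2
exit 1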
00:04:44.843 [2024-11-27 06:14:14.152927] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2624 ] 00:04:44.843 EAL: No free 2048 kB hugepages reported on node 1 00:04:44.843 [2024-11-27 06:14:14.220289] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:44.843 [2024-11-27 06:14:14.294887] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:44.843 [2024-11-27 06:14:14.295011] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:45.781 06:14:14 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:45.781 06:14:14 -- common/autotest_common.sh@862 -- # return 0 00:04:45.781 06:14:14 -- common/autotest_common.sh@1595 -- # bdf_id=0 00:04:45.781 06:14:14 -- common/autotest_common.sh@1596 -- # for bdf in "${bdfs[@]}" 00:04:45.781 06:14:14 -- common/autotest_common.sh@1597 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:04:49.071 nvme0n1 00:04:49.071 06:14:17 -- common/autotest_common.sh@1599 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:04:49.071 [2024-11-27 06:14:18.157228] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:04:49.071 request: 00:04:49.071 { 00:04:49.071 "nvme_ctrlr_name": "nvme0", 00:04:49.071 "password": "test", 00:04:49.071 "method": "bdev_nvme_opal_revert", 00:04:49.071 "req_id": 1 00:04:49.071 } 00:04:49.071 Got JSON-RPC error response 00:04:49.071 response: 00:04:49.071 { 00:04:49.071 "code": -32602, 00:04:49.071 "message": "Invalid parameters" 00:04:49.071 } 00:04:49.071 06:14:18 -- common/autotest_common.sh@1599 -- # true 00:04:49.071 06:14:18 -- common/autotest_common.sh@1600 -- # (( ++bdf_id )) 00:04:49.071 06:14:18 -- common/autotest_common.sh@1603 -- # killprocess 2624 00:04:49.071 06:14:18 -- common/autotest_common.sh@936 -- # '[' -z 2624 ']' 00:04:49.071 06:14:18 -- common/autotest_common.sh@940 -- # kill -0 2624 00:04:49.071 06:14:18 -- common/autotest_common.sh@941 -- # uname 00:04:49.071 06:14:18 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:49.071 06:14:18 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2624 00:04:49.071 06:14:18 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:49.071 06:14:18 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:49.071 06:14:18 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2624' 00:04:49.071 killing process with pid 2624 00:04:49.071 06:14:18 -- common/autotest_common.sh@955 -- # kill 2624 00:04:49.071 06:14:18 -- common/autotest_common.sh@960 -- # wait 2624 00:04:50.978 06:14:20 -- spdk/autotest.sh@148 -- # '[' 0 -eq 1 ']' 00:04:50.978 06:14:20 -- spdk/autotest.sh@152 -- # '[' 1 -eq 1 ']' 00:04:50.978 06:14:20 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:04:50.978 06:14:20 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:04:50.978 06:14:20 -- spdk/autotest.sh@160 -- # timing_enter lib 00:04:50.978 06:14:20 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:50.978 06:14:20 -- common/autotest_common.sh@10 -- # set +x 00:04:50.978 06:14:20 -- spdk/autotest.sh@162 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:04:50.978 06:14:20 -- 
common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:50.978 06:14:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:50.978 06:14:20 -- common/autotest_common.sh@10 -- # set +x 00:04:50.978 ************************************ 00:04:50.978 START TEST env 00:04:50.978 ************************************ 00:04:50.978 06:14:20 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:04:50.978 * Looking for test storage... 00:04:50.978 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:04:50.978 06:14:20 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:50.978 06:14:20 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:50.978 06:14:20 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:50.978 06:14:20 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:50.978 06:14:20 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:50.978 06:14:20 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:50.978 06:14:20 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:50.978 06:14:20 -- scripts/common.sh@335 -- # IFS=.-: 00:04:50.978 06:14:20 -- scripts/common.sh@335 -- # read -ra ver1 00:04:50.978 06:14:20 -- scripts/common.sh@336 -- # IFS=.-: 00:04:50.978 06:14:20 -- scripts/common.sh@336 -- # read -ra ver2 00:04:50.978 06:14:20 -- scripts/common.sh@337 -- # local 'op=<' 00:04:50.978 06:14:20 -- scripts/common.sh@339 -- # ver1_l=2 00:04:50.978 06:14:20 -- scripts/common.sh@340 -- # ver2_l=1 00:04:50.978 06:14:20 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:50.978 06:14:20 -- scripts/common.sh@343 -- # case "$op" in 00:04:50.978 06:14:20 -- scripts/common.sh@344 -- # : 1 00:04:50.978 06:14:20 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:50.978 06:14:20 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:50.978 06:14:20 -- scripts/common.sh@364 -- # decimal 1 00:04:50.978 06:14:20 -- scripts/common.sh@352 -- # local d=1 00:04:50.978 06:14:20 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:50.978 06:14:20 -- scripts/common.sh@354 -- # echo 1 00:04:50.978 06:14:20 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:50.978 06:14:20 -- scripts/common.sh@365 -- # decimal 2 00:04:50.978 06:14:20 -- scripts/common.sh@352 -- # local d=2 00:04:50.978 06:14:20 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:50.978 06:14:20 -- scripts/common.sh@354 -- # echo 2 00:04:51.240 06:14:20 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:51.240 06:14:20 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:51.240 06:14:20 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:51.240 06:14:20 -- scripts/common.sh@367 -- # return 0 00:04:51.240 06:14:20 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:51.240 06:14:20 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:51.240 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:51.240 --rc genhtml_branch_coverage=1 00:04:51.240 --rc genhtml_function_coverage=1 00:04:51.240 --rc genhtml_legend=1 00:04:51.240 --rc geninfo_all_blocks=1 00:04:51.240 --rc geninfo_unexecuted_blocks=1 00:04:51.240 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:51.240 ' 00:04:51.240 06:14:20 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:51.240 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:51.240 --rc genhtml_branch_coverage=1 00:04:51.240 --rc genhtml_function_coverage=1 00:04:51.240 --rc genhtml_legend=1 00:04:51.240 --rc geninfo_all_blocks=1 00:04:51.240 --rc geninfo_unexecuted_blocks=1 00:04:51.240 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:51.240 ' 00:04:51.240 06:14:20 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:51.240 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:51.240 --rc genhtml_branch_coverage=1 00:04:51.240 --rc genhtml_function_coverage=1 00:04:51.240 --rc genhtml_legend=1 00:04:51.240 --rc geninfo_all_blocks=1 00:04:51.240 --rc geninfo_unexecuted_blocks=1 00:04:51.240 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:51.240 ' 00:04:51.240 06:14:20 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:51.240 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:51.240 --rc genhtml_branch_coverage=1 00:04:51.240 --rc genhtml_function_coverage=1 00:04:51.240 --rc genhtml_legend=1 00:04:51.240 --rc geninfo_all_blocks=1 00:04:51.240 --rc geninfo_unexecuted_blocks=1 00:04:51.240 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:51.240 ' 00:04:51.240 06:14:20 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:04:51.240 06:14:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:51.240 06:14:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:51.240 06:14:20 -- common/autotest_common.sh@10 -- # set +x 00:04:51.240 ************************************ 00:04:51.240 START TEST env_memory 00:04:51.240 ************************************ 00:04:51.240 06:14:20 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 
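The scripts/common.sh trace above implements a dotted-version comparison: both version strings are split on '.', '-' and ':' and compared field by field, with missing fields treated as zero. The same idea condensed into a single helper; the function name is mine, not the script's:

#!/usr/bin/env bash

# version_lt A B: succeed when version A sorts strictly before version B
version_lt() {
  local IFS=.-:
  local -a ver1 ver2
  read -ra ver1 <<< "$1"
  read -ra ver2 <<< "$2"
  local i len=$((${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}))
  for ((i = 0; i < len; i++)); do
    ((${ver1[i]:-0} < ${ver2[i]:-0})) && return 0
    ((${ver1[i]:-0} > ${ver2[i]:-0})) && return 1
  done
  return 1   # equal is not less-than
}

version_lt 1.15 2 && echo "lcov is older than 2.x"   # the comparison made above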
00:04:51.240 00:04:51.240 00:04:51.240 CUnit - A unit testing framework for C - Version 2.1-3 00:04:51.240 http://cunit.sourceforge.net/ 00:04:51.240 00:04:51.240 00:04:51.240 Suite: memory 00:04:51.240 Test: alloc and free memory map ...[2024-11-27 06:14:20.541445] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:04:51.240 passed 00:04:51.240 Test: mem map translation ...[2024-11-27 06:14:20.554455] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:04:51.240 [2024-11-27 06:14:20.554476] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:04:51.240 [2024-11-27 06:14:20.554507] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:04:51.240 [2024-11-27 06:14:20.554516] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:04:51.240 passed 00:04:51.240 Test: mem map registration ...[2024-11-27 06:14:20.575253] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:04:51.240 [2024-11-27 06:14:20.575269] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:04:51.240 passed 00:04:51.240 Test: mem map adjacent registrations ...passed 00:04:51.240 00:04:51.240 Run Summary: Type Total Ran Passed Failed Inactive 00:04:51.240 suites 1 1 n/a 0 0 00:04:51.240 tests 4 4 4 0 0 00:04:51.240 asserts 152 152 152 0 n/a 00:04:51.240 00:04:51.240 Elapsed time = 0.075 seconds 00:04:51.240 00:04:51.240 real 0m0.082s 00:04:51.240 user 0m0.076s 00:04:51.240 sys 0m0.006s 00:04:51.240 06:14:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:51.240 06:14:20 -- common/autotest_common.sh@10 -- # set +x 00:04:51.240 ************************************ 00:04:51.240 END TEST env_memory 00:04:51.240 ************************************ 00:04:51.240 06:14:20 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:51.240 06:14:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:51.240 06:14:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:51.240 06:14:20 -- common/autotest_common.sh@10 -- # set +x 00:04:51.240 ************************************ 00:04:51.240 START TEST env_vtophys 00:04:51.240 ************************************ 00:04:51.240 06:14:20 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:51.240 EAL: lib.eal log level changed from notice to debug 00:04:51.240 EAL: Detected lcore 0 as core 0 on socket 0 00:04:51.240 EAL: Detected lcore 1 as core 1 on socket 0 00:04:51.240 EAL: Detected lcore 2 as core 2 on socket 0 00:04:51.240 EAL: Detected lcore 3 as core 3 on socket 0 00:04:51.240 EAL: Detected lcore 4 as core 4 on socket 0 00:04:51.240 EAL: Detected lcore 5 as core 5 on socket 0 00:04:51.240 EAL: Detected lcore 6 as 
core 6 on socket 0 00:04:51.240 EAL: Detected lcore 7 as core 8 on socket 0 00:04:51.240 EAL: Detected lcore 8 as core 9 on socket 0 00:04:51.240 EAL: Detected lcore 9 as core 10 on socket 0 00:04:51.240 EAL: Detected lcore 10 as core 11 on socket 0 00:04:51.240 EAL: Detected lcore 11 as core 12 on socket 0 00:04:51.240 EAL: Detected lcore 12 as core 13 on socket 0 00:04:51.240 EAL: Detected lcore 13 as core 14 on socket 0 00:04:51.240 EAL: Detected lcore 14 as core 16 on socket 0 00:04:51.240 EAL: Detected lcore 15 as core 17 on socket 0 00:04:51.240 EAL: Detected lcore 16 as core 18 on socket 0 00:04:51.240 EAL: Detected lcore 17 as core 19 on socket 0 00:04:51.240 EAL: Detected lcore 18 as core 20 on socket 0 00:04:51.240 EAL: Detected lcore 19 as core 21 on socket 0 00:04:51.240 EAL: Detected lcore 20 as core 22 on socket 0 00:04:51.240 EAL: Detected lcore 21 as core 24 on socket 0 00:04:51.240 EAL: Detected lcore 22 as core 25 on socket 0 00:04:51.240 EAL: Detected lcore 23 as core 26 on socket 0 00:04:51.240 EAL: Detected lcore 24 as core 27 on socket 0 00:04:51.240 EAL: Detected lcore 25 as core 28 on socket 0 00:04:51.240 EAL: Detected lcore 26 as core 29 on socket 0 00:04:51.240 EAL: Detected lcore 27 as core 30 on socket 0 00:04:51.240 EAL: Detected lcore 28 as core 0 on socket 1 00:04:51.240 EAL: Detected lcore 29 as core 1 on socket 1 00:04:51.240 EAL: Detected lcore 30 as core 2 on socket 1 00:04:51.240 EAL: Detected lcore 31 as core 3 on socket 1 00:04:51.240 EAL: Detected lcore 32 as core 4 on socket 1 00:04:51.240 EAL: Detected lcore 33 as core 5 on socket 1 00:04:51.240 EAL: Detected lcore 34 as core 6 on socket 1 00:04:51.240 EAL: Detected lcore 35 as core 8 on socket 1 00:04:51.240 EAL: Detected lcore 36 as core 9 on socket 1 00:04:51.240 EAL: Detected lcore 37 as core 10 on socket 1 00:04:51.240 EAL: Detected lcore 38 as core 11 on socket 1 00:04:51.240 EAL: Detected lcore 39 as core 12 on socket 1 00:04:51.240 EAL: Detected lcore 40 as core 13 on socket 1 00:04:51.240 EAL: Detected lcore 41 as core 14 on socket 1 00:04:51.240 EAL: Detected lcore 42 as core 16 on socket 1 00:04:51.240 EAL: Detected lcore 43 as core 17 on socket 1 00:04:51.240 EAL: Detected lcore 44 as core 18 on socket 1 00:04:51.240 EAL: Detected lcore 45 as core 19 on socket 1 00:04:51.240 EAL: Detected lcore 46 as core 20 on socket 1 00:04:51.240 EAL: Detected lcore 47 as core 21 on socket 1 00:04:51.240 EAL: Detected lcore 48 as core 22 on socket 1 00:04:51.240 EAL: Detected lcore 49 as core 24 on socket 1 00:04:51.240 EAL: Detected lcore 50 as core 25 on socket 1 00:04:51.240 EAL: Detected lcore 51 as core 26 on socket 1 00:04:51.240 EAL: Detected lcore 52 as core 27 on socket 1 00:04:51.240 EAL: Detected lcore 53 as core 28 on socket 1 00:04:51.240 EAL: Detected lcore 54 as core 29 on socket 1 00:04:51.240 EAL: Detected lcore 55 as core 30 on socket 1 00:04:51.240 EAL: Detected lcore 56 as core 0 on socket 0 00:04:51.240 EAL: Detected lcore 57 as core 1 on socket 0 00:04:51.240 EAL: Detected lcore 58 as core 2 on socket 0 00:04:51.240 EAL: Detected lcore 59 as core 3 on socket 0 00:04:51.240 EAL: Detected lcore 60 as core 4 on socket 0 00:04:51.240 EAL: Detected lcore 61 as core 5 on socket 0 00:04:51.240 EAL: Detected lcore 62 as core 6 on socket 0 00:04:51.240 EAL: Detected lcore 63 as core 8 on socket 0 00:04:51.240 EAL: Detected lcore 64 as core 9 on socket 0 00:04:51.240 EAL: Detected lcore 65 as core 10 on socket 0 00:04:51.240 EAL: Detected lcore 66 as core 11 on socket 0 00:04:51.240 EAL: 
Detected lcore 67 as core 12 on socket 0 00:04:51.240 EAL: Detected lcore 68 as core 13 on socket 0 00:04:51.240 EAL: Detected lcore 69 as core 14 on socket 0 00:04:51.240 EAL: Detected lcore 70 as core 16 on socket 0 00:04:51.240 EAL: Detected lcore 71 as core 17 on socket 0 00:04:51.240 EAL: Detected lcore 72 as core 18 on socket 0 00:04:51.241 EAL: Detected lcore 73 as core 19 on socket 0 00:04:51.241 EAL: Detected lcore 74 as core 20 on socket 0 00:04:51.241 EAL: Detected lcore 75 as core 21 on socket 0 00:04:51.241 EAL: Detected lcore 76 as core 22 on socket 0 00:04:51.241 EAL: Detected lcore 77 as core 24 on socket 0 00:04:51.241 EAL: Detected lcore 78 as core 25 on socket 0 00:04:51.241 EAL: Detected lcore 79 as core 26 on socket 0 00:04:51.241 EAL: Detected lcore 80 as core 27 on socket 0 00:04:51.241 EAL: Detected lcore 81 as core 28 on socket 0 00:04:51.241 EAL: Detected lcore 82 as core 29 on socket 0 00:04:51.241 EAL: Detected lcore 83 as core 30 on socket 0 00:04:51.241 EAL: Detected lcore 84 as core 0 on socket 1 00:04:51.241 EAL: Detected lcore 85 as core 1 on socket 1 00:04:51.241 EAL: Detected lcore 86 as core 2 on socket 1 00:04:51.241 EAL: Detected lcore 87 as core 3 on socket 1 00:04:51.241 EAL: Detected lcore 88 as core 4 on socket 1 00:04:51.241 EAL: Detected lcore 89 as core 5 on socket 1 00:04:51.241 EAL: Detected lcore 90 as core 6 on socket 1 00:04:51.241 EAL: Detected lcore 91 as core 8 on socket 1 00:04:51.241 EAL: Detected lcore 92 as core 9 on socket 1 00:04:51.241 EAL: Detected lcore 93 as core 10 on socket 1 00:04:51.241 EAL: Detected lcore 94 as core 11 on socket 1 00:04:51.241 EAL: Detected lcore 95 as core 12 on socket 1 00:04:51.241 EAL: Detected lcore 96 as core 13 on socket 1 00:04:51.241 EAL: Detected lcore 97 as core 14 on socket 1 00:04:51.241 EAL: Detected lcore 98 as core 16 on socket 1 00:04:51.241 EAL: Detected lcore 99 as core 17 on socket 1 00:04:51.241 EAL: Detected lcore 100 as core 18 on socket 1 00:04:51.241 EAL: Detected lcore 101 as core 19 on socket 1 00:04:51.241 EAL: Detected lcore 102 as core 20 on socket 1 00:04:51.241 EAL: Detected lcore 103 as core 21 on socket 1 00:04:51.241 EAL: Detected lcore 104 as core 22 on socket 1 00:04:51.241 EAL: Detected lcore 105 as core 24 on socket 1 00:04:51.241 EAL: Detected lcore 106 as core 25 on socket 1 00:04:51.241 EAL: Detected lcore 107 as core 26 on socket 1 00:04:51.241 EAL: Detected lcore 108 as core 27 on socket 1 00:04:51.241 EAL: Detected lcore 109 as core 28 on socket 1 00:04:51.241 EAL: Detected lcore 110 as core 29 on socket 1 00:04:51.241 EAL: Detected lcore 111 as core 30 on socket 1 00:04:51.241 EAL: Maximum logical cores by configuration: 128 00:04:51.241 EAL: Detected CPU lcores: 112 00:04:51.241 EAL: Detected NUMA nodes: 2 00:04:51.241 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:04:51.241 EAL: Checking presence of .so 'librte_eal.so.24' 00:04:51.241 EAL: Checking presence of .so 'librte_eal.so' 00:04:51.241 EAL: Detected static linkage of DPDK 00:04:51.241 EAL: No shared files mode enabled, IPC will be disabled 00:04:51.241 EAL: Bus pci wants IOVA as 'DC' 00:04:51.241 EAL: Buses did not request a specific IOVA mode. 00:04:51.241 EAL: IOMMU is available, selecting IOVA as VA mode. 00:04:51.241 EAL: Selected IOVA mode 'VA' 00:04:51.241 EAL: No free 2048 kB hugepages reported on node 1 00:04:51.241 EAL: Probing VFIO support... 
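The lcore inventory above is EAL reading standard Linux sysfs topology files at startup; the same lcore-to-core/socket table can be reproduced with a short loop, no DPDK required:

#!/usr/bin/env bash
# Print the mapping EAL logs during startup for every CPU the kernel exposes.
for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
  lcore=${cpu##*cpu}
  core=$(cat "$cpu/topology/core_id")
  socket=$(cat "$cpu/topology/physical_package_id")
  echo "Detected lcore $lcore as core $core on socket $socket"
done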
00:04:51.241 EAL: IOMMU type 1 (Type 1) is supported 00:04:51.241 EAL: IOMMU type 7 (sPAPR) is not supported 00:04:51.241 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:04:51.241 EAL: VFIO support initialized 00:04:51.241 EAL: Ask a virtual area of 0x2e000 bytes 00:04:51.241 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:04:51.241 EAL: Setting up physically contiguous memory... 00:04:51.241 EAL: Setting maximum number of open files to 524288 00:04:51.241 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:04:51.241 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:04:51.241 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:04:51.241 EAL: Ask a virtual area of 0x61000 bytes 00:04:51.241 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:04:51.241 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:51.241 EAL: Ask a virtual area of 0x400000000 bytes 00:04:51.241 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:04:51.241 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:04:51.241 EAL: Ask a virtual area of 0x61000 bytes 00:04:51.241 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:04:51.241 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:51.241 EAL: Ask a virtual area of 0x400000000 bytes 00:04:51.241 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:04:51.241 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:04:51.241 EAL: Ask a virtual area of 0x61000 bytes 00:04:51.241 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:04:51.241 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:51.241 EAL: Ask a virtual area of 0x400000000 bytes 00:04:51.241 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:04:51.241 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:04:51.241 EAL: Ask a virtual area of 0x61000 bytes 00:04:51.241 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:04:51.241 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:51.241 EAL: Ask a virtual area of 0x400000000 bytes 00:04:51.241 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:04:51.241 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:04:51.241 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:04:51.241 EAL: Ask a virtual area of 0x61000 bytes 00:04:51.241 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:04:51.241 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:51.241 EAL: Ask a virtual area of 0x400000000 bytes 00:04:51.241 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:04:51.241 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:04:51.241 EAL: Ask a virtual area of 0x61000 bytes 00:04:51.241 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:04:51.241 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:51.241 EAL: Ask a virtual area of 0x400000000 bytes 00:04:51.241 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:04:51.241 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:04:51.241 EAL: Ask a virtual area of 0x61000 bytes 00:04:51.241 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:04:51.241 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:51.241 EAL: Ask a virtual area of 0x400000000 bytes 00:04:51.241 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:04:51.241 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:04:51.241 EAL: Ask a virtual area of 0x61000 bytes 00:04:51.241 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:04:51.241 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:51.241 EAL: Ask a virtual area of 0x400000000 bytes 00:04:51.241 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:04:51.241 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:04:51.241 EAL: Hugepages will be freed exactly as allocated. 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: TSC frequency is ~2500000 KHz 00:04:51.241 EAL: Main lcore 0 is ready (tid=7f6a8d109a00;cpuset=[0]) 00:04:51.241 EAL: Trying to obtain current memory policy. 00:04:51.241 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:51.241 EAL: Restoring previous memory policy: 0 00:04:51.241 EAL: request: mp_malloc_sync 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: Heap on socket 0 was expanded by 2MB 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: Mem event callback 'spdk:(nil)' registered 00:04:51.241 00:04:51.241 00:04:51.241 CUnit - A unit testing framework for C - Version 2.1-3 00:04:51.241 http://cunit.sourceforge.net/ 00:04:51.241 00:04:51.241 00:04:51.241 Suite: components_suite 00:04:51.241 Test: vtophys_malloc_test ...passed 00:04:51.241 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:04:51.241 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:51.241 EAL: Restoring previous memory policy: 4 00:04:51.241 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.241 EAL: request: mp_malloc_sync 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: Heap on socket 0 was expanded by 4MB 00:04:51.241 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.241 EAL: request: mp_malloc_sync 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: Heap on socket 0 was shrunk by 4MB 00:04:51.241 EAL: Trying to obtain current memory policy. 00:04:51.241 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:51.241 EAL: Restoring previous memory policy: 4 00:04:51.241 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.241 EAL: request: mp_malloc_sync 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: Heap on socket 0 was expanded by 6MB 00:04:51.241 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.241 EAL: request: mp_malloc_sync 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: Heap on socket 0 was shrunk by 6MB 00:04:51.241 EAL: Trying to obtain current memory policy. 00:04:51.241 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:51.241 EAL: Restoring previous memory policy: 4 00:04:51.241 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.241 EAL: request: mp_malloc_sync 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: Heap on socket 0 was expanded by 10MB 00:04:51.241 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.241 EAL: request: mp_malloc_sync 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: Heap on socket 0 was shrunk by 10MB 00:04:51.241 EAL: Trying to obtain current memory policy. 
00:04:51.241 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:51.241 EAL: Restoring previous memory policy: 4 00:04:51.241 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.241 EAL: request: mp_malloc_sync 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: Heap on socket 0 was expanded by 18MB 00:04:51.241 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.241 EAL: request: mp_malloc_sync 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: Heap on socket 0 was shrunk by 18MB 00:04:51.241 EAL: Trying to obtain current memory policy. 00:04:51.241 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:51.241 EAL: Restoring previous memory policy: 4 00:04:51.241 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.241 EAL: request: mp_malloc_sync 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: Heap on socket 0 was expanded by 34MB 00:04:51.241 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.241 EAL: request: mp_malloc_sync 00:04:51.241 EAL: No shared files mode enabled, IPC is disabled 00:04:51.241 EAL: Heap on socket 0 was shrunk by 34MB 00:04:51.241 EAL: Trying to obtain current memory policy. 00:04:51.242 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:51.242 EAL: Restoring previous memory policy: 4 00:04:51.242 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.242 EAL: request: mp_malloc_sync 00:04:51.242 EAL: No shared files mode enabled, IPC is disabled 00:04:51.242 EAL: Heap on socket 0 was expanded by 66MB 00:04:51.242 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.502 EAL: request: mp_malloc_sync 00:04:51.502 EAL: No shared files mode enabled, IPC is disabled 00:04:51.502 EAL: Heap on socket 0 was shrunk by 66MB 00:04:51.502 EAL: Trying to obtain current memory policy. 00:04:51.502 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:51.502 EAL: Restoring previous memory policy: 4 00:04:51.502 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.502 EAL: request: mp_malloc_sync 00:04:51.502 EAL: No shared files mode enabled, IPC is disabled 00:04:51.502 EAL: Heap on socket 0 was expanded by 130MB 00:04:51.502 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.502 EAL: request: mp_malloc_sync 00:04:51.502 EAL: No shared files mode enabled, IPC is disabled 00:04:51.502 EAL: Heap on socket 0 was shrunk by 130MB 00:04:51.502 EAL: Trying to obtain current memory policy. 00:04:51.502 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:51.502 EAL: Restoring previous memory policy: 4 00:04:51.502 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.502 EAL: request: mp_malloc_sync 00:04:51.502 EAL: No shared files mode enabled, IPC is disabled 00:04:51.502 EAL: Heap on socket 0 was expanded by 258MB 00:04:51.502 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.502 EAL: request: mp_malloc_sync 00:04:51.502 EAL: No shared files mode enabled, IPC is disabled 00:04:51.502 EAL: Heap on socket 0 was shrunk by 258MB 00:04:51.502 EAL: Trying to obtain current memory policy. 
00:04:51.502 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:51.762 EAL: Restoring previous memory policy: 4 00:04:51.762 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.762 EAL: request: mp_malloc_sync 00:04:51.762 EAL: No shared files mode enabled, IPC is disabled 00:04:51.762 EAL: Heap on socket 0 was expanded by 514MB 00:04:51.762 EAL: Calling mem event callback 'spdk:(nil)' 00:04:51.762 EAL: request: mp_malloc_sync 00:04:51.762 EAL: No shared files mode enabled, IPC is disabled 00:04:51.762 EAL: Heap on socket 0 was shrunk by 514MB 00:04:51.762 EAL: Trying to obtain current memory policy. 00:04:51.762 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:52.021 EAL: Restoring previous memory policy: 4 00:04:52.021 EAL: Calling mem event callback 'spdk:(nil)' 00:04:52.021 EAL: request: mp_malloc_sync 00:04:52.021 EAL: No shared files mode enabled, IPC is disabled 00:04:52.021 EAL: Heap on socket 0 was expanded by 1026MB 00:04:52.281 EAL: Calling mem event callback 'spdk:(nil)' 00:04:52.281 EAL: request: mp_malloc_sync 00:04:52.281 EAL: No shared files mode enabled, IPC is disabled 00:04:52.281 EAL: Heap on socket 0 was shrunk by 1026MB 00:04:52.281 passed 00:04:52.281 00:04:52.281 Run Summary: Type Total Ran Passed Failed Inactive 00:04:52.281 suites 1 1 n/a 0 0 00:04:52.281 tests 2 2 2 0 0 00:04:52.281 asserts 497 497 497 0 n/a 00:04:52.281 00:04:52.281 Elapsed time = 0.960 seconds 00:04:52.281 EAL: Calling mem event callback 'spdk:(nil)' 00:04:52.281 EAL: request: mp_malloc_sync 00:04:52.281 EAL: No shared files mode enabled, IPC is disabled 00:04:52.281 EAL: Heap on socket 0 was shrunk by 2MB 00:04:52.281 EAL: No shared files mode enabled, IPC is disabled 00:04:52.281 EAL: No shared files mode enabled, IPC is disabled 00:04:52.281 EAL: No shared files mode enabled, IPC is disabled 00:04:52.281 00:04:52.281 real 0m1.083s 00:04:52.281 user 0m0.624s 00:04:52.281 sys 0m0.426s 00:04:52.281 06:14:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:52.281 06:14:21 -- common/autotest_common.sh@10 -- # set +x 00:04:52.281 ************************************ 00:04:52.281 END TEST env_vtophys 00:04:52.281 ************************************ 00:04:52.281 06:14:21 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:04:52.281 06:14:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:52.281 06:14:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:52.281 06:14:21 -- common/autotest_common.sh@10 -- # set +x 00:04:52.281 ************************************ 00:04:52.281 START TEST env_pci 00:04:52.281 ************************************ 00:04:52.281 06:14:21 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:04:52.281 00:04:52.281 00:04:52.281 CUnit - A unit testing framework for C - Version 2.1-3 00:04:52.281 http://cunit.sourceforge.net/ 00:04:52.282 00:04:52.282 00:04:52.282 Suite: pci 00:04:52.282 Test: pci_hook ...[2024-11-27 06:14:21.791267] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 4258 has claimed it 00:04:52.541 EAL: Cannot find device (10000:00:01.0) 00:04:52.541 EAL: Failed to attach device on primary process 00:04:52.541 passed 00:04:52.541 00:04:52.541 Run Summary: Type Total Ran Passed Failed Inactive 00:04:52.541 suites 1 1 n/a 0 0 00:04:52.541 tests 1 1 1 0 0 
00:04:52.541 asserts 25 25 25 0 n/a 00:04:52.541 00:04:52.541 Elapsed time = 0.036 seconds 00:04:52.541 00:04:52.541 real 0m0.056s 00:04:52.541 user 0m0.016s 00:04:52.541 sys 0m0.040s 00:04:52.542 06:14:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:52.542 06:14:21 -- common/autotest_common.sh@10 -- # set +x 00:04:52.542 ************************************ 00:04:52.542 END TEST env_pci 00:04:52.542 ************************************ 00:04:52.542 06:14:21 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:04:52.542 06:14:21 -- env/env.sh@15 -- # uname 00:04:52.542 06:14:21 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:04:52.542 06:14:21 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:04:52.542 06:14:21 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:52.542 06:14:21 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:04:52.542 06:14:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:52.542 06:14:21 -- common/autotest_common.sh@10 -- # set +x 00:04:52.542 ************************************ 00:04:52.542 START TEST env_dpdk_post_init 00:04:52.542 ************************************ 00:04:52.542 06:14:21 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:52.542 EAL: Detected CPU lcores: 112 00:04:52.542 EAL: Detected NUMA nodes: 2 00:04:52.542 EAL: Detected static linkage of DPDK 00:04:52.542 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:04:52.542 EAL: Selected IOVA mode 'VA' 00:04:52.542 EAL: No free 2048 kB hugepages reported on node 1 00:04:52.542 EAL: VFIO support initialized 00:04:52.542 TELEMETRY: No legacy callbacks, legacy socket not created 00:04:52.542 EAL: Using IOMMU type 1 (Type 1) 00:04:53.480 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:04:56.769 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:04:56.769 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:04:57.337 Starting DPDK initialization... 00:04:57.337 Starting SPDK post initialization... 00:04:57.337 SPDK NVMe probe 00:04:57.337 Attaching to 0000:d8:00.0 00:04:57.337 Attached to 0000:d8:00.0 00:04:57.337 Cleaning up... 
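The env_dpdk_post_init run above (detect lcores and NUMA nodes, select IOVA mode 'VA', attach spdk_nvme to 0000:d8:00.0) covers the case where the application brings up DPDK itself and only afterwards hands the initialized environment to SPDK. A rough sketch of that flow, with stub probe/attach callbacks (the real test goes on to issue I/O against the attached controller):

```c
#include <stdbool.h>
#include <stdio.h>
#include <rte_eal.h>
#include "spdk/env_dpdk.h"
#include "spdk/nvme.h"

static bool
probe_cb(void *cb_ctx, const struct spdk_nvme_transport_id *trid,
	 struct spdk_nvme_ctrlr_opts *opts)
{
	printf("probing %s\n", trid->traddr);	/* e.g. 0000:d8:00.0 above */
	return true;				/* attach to everything found */
}

static void
attach_cb(void *cb_ctx, const struct spdk_nvme_transport_id *trid,
	  struct spdk_nvme_ctrlr *ctrlr,
	  const struct spdk_nvme_ctrlr_opts *opts)
{
	printf("attached to %s\n", trid->traddr);
}

int
main(int argc, char **argv)
{
	if (rte_eal_init(argc, argv) < 0) {
		return 1;
	}
	/* Hand the already-initialized DPDK environment to SPDK
	 * (false = not running in legacy-mem mode). */
	if (spdk_env_dpdk_post_init(false) != 0) {
		return 1;
	}
	return spdk_nvme_probe(NULL, NULL, probe_cb, attach_cb, NULL);
}
```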
00:04:57.337 00:04:57.337 real 0m4.732s 00:04:57.337 user 0m3.576s 00:04:57.337 sys 0m0.398s 00:04:57.337 06:14:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:57.337 06:14:26 -- common/autotest_common.sh@10 -- # set +x 00:04:57.337 ************************************ 00:04:57.337 END TEST env_dpdk_post_init 00:04:57.337 ************************************ 00:04:57.337 06:14:26 -- env/env.sh@26 -- # uname 00:04:57.337 06:14:26 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:04:57.337 06:14:26 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:04:57.337 06:14:26 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:57.337 06:14:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:57.337 06:14:26 -- common/autotest_common.sh@10 -- # set +x 00:04:57.337 ************************************ 00:04:57.337 START TEST env_mem_callbacks 00:04:57.337 ************************************ 00:04:57.337 06:14:26 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:04:57.337 EAL: Detected CPU lcores: 112 00:04:57.337 EAL: Detected NUMA nodes: 2 00:04:57.337 EAL: Detected static linkage of DPDK 00:04:57.337 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:04:57.337 EAL: Selected IOVA mode 'VA' 00:04:57.337 EAL: No free 2048 kB hugepages reported on node 1 00:04:57.337 EAL: VFIO support initialized 00:04:57.337 TELEMETRY: No legacy callbacks, legacy socket not created 00:04:57.337 00:04:57.337 00:04:57.337 CUnit - A unit testing framework for C - Version 2.1-3 00:04:57.337 http://cunit.sourceforge.net/ 00:04:57.337 00:04:57.337 00:04:57.337 Suite: memory 00:04:57.337 Test: test ... 
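The register/unregister lines that follow are the mem_callbacks suite pushing synthetic address ranges through SPDK's memory-map layer: spdk_mem_register() and spdk_mem_unregister() notify every registered map (vtophys, IOMMU, and so on) about buffers SPDK did not allocate itself. A minimal sketch, assuming an already-initialized SPDK env:

```c
#include <sys/mman.h>
#include "spdk/env.h"

/* Sketch: make an externally-allocated buffer visible to SPDK's memory maps,
 * mirroring the register/unregister pairs printed by the test below.
 * NB: real callers must satisfy SPDK's 2 MB alignment rule for the region;
 * plain mmap() only guarantees page alignment, glossed over here. */
static int
register_external_buffer(void)
{
	size_t len = 2 * 1024 * 1024;
	void *buf = mmap(NULL, len, PROT_READ | PROT_WRITE,
			 MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);

	if (buf == MAP_FAILED) {
		return -1;
	}
	if (spdk_mem_register(buf, len) != 0) {	/* registered callbacks fire */
		munmap(buf, len);
		return -1;
	}
	/* ... the buffer is now safe to hand to DMA-capable devices ... */
	spdk_mem_unregister(buf, len);		/* callbacks fire again */
	munmap(buf, len);
	return 0;
}
```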
00:04:57.337 register 0x200000200000 2097152 00:04:57.337 malloc 3145728 00:04:57.337 register 0x200000400000 4194304 00:04:57.337 buf 0x200000500000 len 3145728 PASSED 00:04:57.337 malloc 64 00:04:57.337 buf 0x2000004fff40 len 64 PASSED 00:04:57.337 malloc 4194304 00:04:57.337 register 0x200000800000 6291456 00:04:57.337 buf 0x200000a00000 len 4194304 PASSED 00:04:57.337 free 0x200000500000 3145728 00:04:57.337 free 0x2000004fff40 64 00:04:57.337 unregister 0x200000400000 4194304 PASSED 00:04:57.338 free 0x200000a00000 4194304 00:04:57.338 unregister 0x200000800000 6291456 PASSED 00:04:57.338 malloc 8388608 00:04:57.338 register 0x200000400000 10485760 00:04:57.338 buf 0x200000600000 len 8388608 PASSED 00:04:57.338 free 0x200000600000 8388608 00:04:57.338 unregister 0x200000400000 10485760 PASSED 00:04:57.338 passed 00:04:57.338 00:04:57.338 Run Summary: Type Total Ran Passed Failed Inactive 00:04:57.338 suites 1 1 n/a 0 0 00:04:57.338 tests 1 1 1 0 0 00:04:57.338 asserts 15 15 15 0 n/a 00:04:57.338 00:04:57.338 Elapsed time = 0.005 seconds 00:04:57.338 00:04:57.338 real 0m0.065s 00:04:57.338 user 0m0.017s 00:04:57.338 sys 0m0.048s 00:04:57.338 06:14:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:57.338 06:14:26 -- common/autotest_common.sh@10 -- # set +x 00:04:57.338 ************************************ 00:04:57.338 END TEST env_mem_callbacks 00:04:57.338 ************************************ 00:04:57.338 00:04:57.338 real 0m6.411s 00:04:57.338 user 0m4.482s 00:04:57.338 sys 0m1.192s 00:04:57.338 06:14:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:57.338 06:14:26 -- common/autotest_common.sh@10 -- # set +x 00:04:57.338 ************************************ 00:04:57.338 END TEST env 00:04:57.338 ************************************ 00:04:57.338 06:14:26 -- spdk/autotest.sh@163 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:04:57.338 06:14:26 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:57.338 06:14:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:57.338 06:14:26 -- common/autotest_common.sh@10 -- # set +x 00:04:57.338 ************************************ 00:04:57.338 START TEST rpc 00:04:57.338 ************************************ 00:04:57.338 06:14:26 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:04:57.597 * Looking for test storage... 
00:04:57.597 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:04:57.597 06:14:26 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:57.597 06:14:26 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:57.597 06:14:26 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:57.597 06:14:26 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:57.597 06:14:26 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:57.597 06:14:26 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:57.597 06:14:26 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:57.597 06:14:26 -- scripts/common.sh@335 -- # IFS=.-: 00:04:57.597 06:14:26 -- scripts/common.sh@335 -- # read -ra ver1 00:04:57.598 06:14:26 -- scripts/common.sh@336 -- # IFS=.-: 00:04:57.598 06:14:26 -- scripts/common.sh@336 -- # read -ra ver2 00:04:57.598 06:14:26 -- scripts/common.sh@337 -- # local 'op=<' 00:04:57.598 06:14:26 -- scripts/common.sh@339 -- # ver1_l=2 00:04:57.598 06:14:26 -- scripts/common.sh@340 -- # ver2_l=1 00:04:57.598 06:14:26 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:57.598 06:14:26 -- scripts/common.sh@343 -- # case "$op" in 00:04:57.598 06:14:26 -- scripts/common.sh@344 -- # : 1 00:04:57.598 06:14:26 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:57.598 06:14:26 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:57.598 06:14:26 -- scripts/common.sh@364 -- # decimal 1 00:04:57.598 06:14:26 -- scripts/common.sh@352 -- # local d=1 00:04:57.598 06:14:26 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:57.598 06:14:26 -- scripts/common.sh@354 -- # echo 1 00:04:57.598 06:14:26 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:57.598 06:14:26 -- scripts/common.sh@365 -- # decimal 2 00:04:57.598 06:14:26 -- scripts/common.sh@352 -- # local d=2 00:04:57.598 06:14:26 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:57.598 06:14:26 -- scripts/common.sh@354 -- # echo 2 00:04:57.598 06:14:26 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:57.598 06:14:26 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:57.598 06:14:26 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:57.598 06:14:26 -- scripts/common.sh@367 -- # return 0 00:04:57.598 06:14:26 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:57.598 06:14:26 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:57.598 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:57.598 --rc genhtml_branch_coverage=1 00:04:57.598 --rc genhtml_function_coverage=1 00:04:57.598 --rc genhtml_legend=1 00:04:57.598 --rc geninfo_all_blocks=1 00:04:57.598 --rc geninfo_unexecuted_blocks=1 00:04:57.598 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:57.598 ' 00:04:57.598 06:14:26 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:57.598 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:57.598 --rc genhtml_branch_coverage=1 00:04:57.598 --rc genhtml_function_coverage=1 00:04:57.598 --rc genhtml_legend=1 00:04:57.598 --rc geninfo_all_blocks=1 00:04:57.598 --rc geninfo_unexecuted_blocks=1 00:04:57.598 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:57.598 ' 00:04:57.598 06:14:26 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:57.598 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:57.598 --rc genhtml_branch_coverage=1 00:04:57.598 
--rc genhtml_function_coverage=1 00:04:57.598 --rc genhtml_legend=1 00:04:57.598 --rc geninfo_all_blocks=1 00:04:57.598 --rc geninfo_unexecuted_blocks=1 00:04:57.598 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:57.598 ' 00:04:57.598 06:14:26 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:57.598 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:57.598 --rc genhtml_branch_coverage=1 00:04:57.598 --rc genhtml_function_coverage=1 00:04:57.598 --rc genhtml_legend=1 00:04:57.598 --rc geninfo_all_blocks=1 00:04:57.598 --rc geninfo_unexecuted_blocks=1 00:04:57.598 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:57.598 ' 00:04:57.598 06:14:26 -- rpc/rpc.sh@65 -- # spdk_pid=5272 00:04:57.598 06:14:26 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:04:57.598 06:14:26 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:04:57.598 06:14:26 -- rpc/rpc.sh@67 -- # waitforlisten 5272 00:04:57.598 06:14:26 -- common/autotest_common.sh@829 -- # '[' -z 5272 ']' 00:04:57.598 06:14:26 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:57.598 06:14:26 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:57.598 06:14:26 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:57.598 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:57.598 06:14:26 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:57.598 06:14:26 -- common/autotest_common.sh@10 -- # set +x 00:04:57.598 [2024-11-27 06:14:27.009401] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:04:57.598 [2024-11-27 06:14:27.009483] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid5272 ] 00:04:57.598 EAL: No free 2048 kB hugepages reported on node 1 00:04:57.598 [2024-11-27 06:14:27.073318] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:57.857 [2024-11-27 06:14:27.149297] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:57.857 [2024-11-27 06:14:27.149415] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:04:57.857 [2024-11-27 06:14:27.149427] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 5272' to capture a snapshot of events at runtime. 00:04:57.857 [2024-11-27 06:14:27.149436] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid5272 for offline analysis/debug. 
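waitforlisten, seen just above, is essentially a readiness poll: keep trying to connect to the target's JSON-RPC UNIX-domain socket (/var/tmp/spdk.sock by default) until the freshly started spdk_tgt accepts a connection. A plain-POSIX sketch of that check:

```c
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>
#include <sys/un.h>

/* Returns 0 once something is accepting connections on the given UNIX-domain
 * socket path, -1 after max_tries attempts (one try per second). */
static int
wait_for_listen(const char *path, int max_tries)
{
	struct sockaddr_un addr;
	int i;

	memset(&addr, 0, sizeof(addr));
	addr.sun_family = AF_UNIX;
	strncpy(addr.sun_path, path, sizeof(addr.sun_path) - 1);

	for (i = 0; i < max_tries; i++) {
		int fd = socket(AF_UNIX, SOCK_STREAM, 0);

		if (fd >= 0 &&
		    connect(fd, (struct sockaddr *)&addr, sizeof(addr)) == 0) {
			close(fd);
			return 0;
		}
		if (fd >= 0) {
			close(fd);
		}
		sleep(1);
	}
	return -1;
}
```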
00:04:57.857 [2024-11-27 06:14:27.149455] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:58.426 06:14:27 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:58.426 06:14:27 -- common/autotest_common.sh@862 -- # return 0 00:04:58.426 06:14:27 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:04:58.426 06:14:27 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:04:58.426 06:14:27 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:04:58.426 06:14:27 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:04:58.426 06:14:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:58.426 06:14:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:58.426 06:14:27 -- common/autotest_common.sh@10 -- # set +x 00:04:58.426 ************************************ 00:04:58.426 START TEST rpc_integrity 00:04:58.426 ************************************ 00:04:58.426 06:14:27 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:04:58.426 06:14:27 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:04:58.426 06:14:27 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:58.426 06:14:27 -- common/autotest_common.sh@10 -- # set +x 00:04:58.426 06:14:27 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:58.426 06:14:27 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:04:58.426 06:14:27 -- rpc/rpc.sh@13 -- # jq length 00:04:58.426 06:14:27 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:04:58.426 06:14:27 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:04:58.426 06:14:27 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:58.426 06:14:27 -- common/autotest_common.sh@10 -- # set +x 00:04:58.426 06:14:27 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:58.426 06:14:27 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:04:58.426 06:14:27 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:04:58.426 06:14:27 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:58.426 06:14:27 -- common/autotest_common.sh@10 -- # set +x 00:04:58.426 06:14:27 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:58.426 06:14:27 -- rpc/rpc.sh@16 -- # bdevs='[ 00:04:58.426 { 00:04:58.426 "name": "Malloc0", 00:04:58.426 "aliases": [ 00:04:58.426 "8fa6643d-f59d-4c86-a26e-bdd892bb3143" 00:04:58.426 ], 00:04:58.426 "product_name": "Malloc disk", 00:04:58.426 "block_size": 512, 00:04:58.426 "num_blocks": 16384, 00:04:58.426 "uuid": "8fa6643d-f59d-4c86-a26e-bdd892bb3143", 00:04:58.426 "assigned_rate_limits": { 00:04:58.426 "rw_ios_per_sec": 0, 00:04:58.426 "rw_mbytes_per_sec": 0, 00:04:58.426 "r_mbytes_per_sec": 0, 00:04:58.426 "w_mbytes_per_sec": 0 00:04:58.426 }, 00:04:58.426 "claimed": false, 00:04:58.426 "zoned": false, 00:04:58.426 "supported_io_types": { 00:04:58.426 "read": true, 00:04:58.426 "write": true, 00:04:58.426 "unmap": true, 00:04:58.426 "write_zeroes": true, 00:04:58.426 "flush": true, 00:04:58.426 "reset": true, 00:04:58.426 "compare": false, 00:04:58.426 "compare_and_write": false, 
00:04:58.426 "abort": true, 00:04:58.426 "nvme_admin": false, 00:04:58.426 "nvme_io": false 00:04:58.426 }, 00:04:58.426 "memory_domains": [ 00:04:58.426 { 00:04:58.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:58.426 "dma_device_type": 2 00:04:58.426 } 00:04:58.426 ], 00:04:58.426 "driver_specific": {} 00:04:58.426 } 00:04:58.426 ]' 00:04:58.426 06:14:27 -- rpc/rpc.sh@17 -- # jq length 00:04:58.686 06:14:27 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:04:58.686 06:14:27 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:04:58.686 06:14:27 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:58.686 06:14:27 -- common/autotest_common.sh@10 -- # set +x 00:04:58.686 [2024-11-27 06:14:27.990133] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:04:58.686 [2024-11-27 06:14:27.990169] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:04:58.686 [2024-11-27 06:14:27.990190] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5f2c030 00:04:58.686 [2024-11-27 06:14:27.990201] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:04:58.686 [2024-11-27 06:14:27.991040] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:04:58.686 [2024-11-27 06:14:27.991064] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:04:58.686 Passthru0 00:04:58.686 06:14:27 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:58.686 06:14:27 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:04:58.686 06:14:27 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:58.686 06:14:27 -- common/autotest_common.sh@10 -- # set +x 00:04:58.686 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:58.686 06:14:28 -- rpc/rpc.sh@20 -- # bdevs='[ 00:04:58.686 { 00:04:58.686 "name": "Malloc0", 00:04:58.686 "aliases": [ 00:04:58.686 "8fa6643d-f59d-4c86-a26e-bdd892bb3143" 00:04:58.686 ], 00:04:58.686 "product_name": "Malloc disk", 00:04:58.686 "block_size": 512, 00:04:58.686 "num_blocks": 16384, 00:04:58.686 "uuid": "8fa6643d-f59d-4c86-a26e-bdd892bb3143", 00:04:58.686 "assigned_rate_limits": { 00:04:58.686 "rw_ios_per_sec": 0, 00:04:58.686 "rw_mbytes_per_sec": 0, 00:04:58.686 "r_mbytes_per_sec": 0, 00:04:58.686 "w_mbytes_per_sec": 0 00:04:58.686 }, 00:04:58.686 "claimed": true, 00:04:58.686 "claim_type": "exclusive_write", 00:04:58.686 "zoned": false, 00:04:58.686 "supported_io_types": { 00:04:58.686 "read": true, 00:04:58.686 "write": true, 00:04:58.686 "unmap": true, 00:04:58.686 "write_zeroes": true, 00:04:58.686 "flush": true, 00:04:58.686 "reset": true, 00:04:58.686 "compare": false, 00:04:58.686 "compare_and_write": false, 00:04:58.686 "abort": true, 00:04:58.686 "nvme_admin": false, 00:04:58.686 "nvme_io": false 00:04:58.686 }, 00:04:58.686 "memory_domains": [ 00:04:58.686 { 00:04:58.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:58.686 "dma_device_type": 2 00:04:58.686 } 00:04:58.686 ], 00:04:58.686 "driver_specific": {} 00:04:58.686 }, 00:04:58.686 { 00:04:58.686 "name": "Passthru0", 00:04:58.686 "aliases": [ 00:04:58.686 "a405e648-f0ac-5361-ba62-557f2ba11542" 00:04:58.686 ], 00:04:58.686 "product_name": "passthru", 00:04:58.686 "block_size": 512, 00:04:58.686 "num_blocks": 16384, 00:04:58.686 "uuid": "a405e648-f0ac-5361-ba62-557f2ba11542", 00:04:58.686 "assigned_rate_limits": { 00:04:58.686 "rw_ios_per_sec": 0, 00:04:58.686 "rw_mbytes_per_sec": 0, 00:04:58.686 "r_mbytes_per_sec": 0, 00:04:58.686 
"w_mbytes_per_sec": 0 00:04:58.686 }, 00:04:58.686 "claimed": false, 00:04:58.686 "zoned": false, 00:04:58.686 "supported_io_types": { 00:04:58.686 "read": true, 00:04:58.686 "write": true, 00:04:58.686 "unmap": true, 00:04:58.686 "write_zeroes": true, 00:04:58.686 "flush": true, 00:04:58.686 "reset": true, 00:04:58.686 "compare": false, 00:04:58.686 "compare_and_write": false, 00:04:58.686 "abort": true, 00:04:58.686 "nvme_admin": false, 00:04:58.686 "nvme_io": false 00:04:58.686 }, 00:04:58.686 "memory_domains": [ 00:04:58.686 { 00:04:58.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:58.686 "dma_device_type": 2 00:04:58.686 } 00:04:58.686 ], 00:04:58.686 "driver_specific": { 00:04:58.686 "passthru": { 00:04:58.686 "name": "Passthru0", 00:04:58.686 "base_bdev_name": "Malloc0" 00:04:58.686 } 00:04:58.686 } 00:04:58.686 } 00:04:58.686 ]' 00:04:58.686 06:14:28 -- rpc/rpc.sh@21 -- # jq length 00:04:58.686 06:14:28 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:04:58.686 06:14:28 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:04:58.686 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:58.686 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:58.686 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:58.686 06:14:28 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:04:58.686 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:58.686 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:58.686 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:58.686 06:14:28 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:04:58.686 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:58.686 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:58.686 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:58.686 06:14:28 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:04:58.686 06:14:28 -- rpc/rpc.sh@26 -- # jq length 00:04:58.686 06:14:28 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:04:58.686 00:04:58.686 real 0m0.290s 00:04:58.686 user 0m0.180s 00:04:58.686 sys 0m0.043s 00:04:58.686 06:14:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:58.686 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:58.686 ************************************ 00:04:58.686 END TEST rpc_integrity 00:04:58.686 ************************************ 00:04:58.686 06:14:28 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:04:58.686 06:14:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:58.686 06:14:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:58.687 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:58.687 ************************************ 00:04:58.687 START TEST rpc_plugins 00:04:58.687 ************************************ 00:04:58.687 06:14:28 -- common/autotest_common.sh@1114 -- # rpc_plugins 00:04:58.687 06:14:28 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:04:58.687 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:58.687 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:58.687 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:58.687 06:14:28 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:04:58.687 06:14:28 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:04:58.687 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:58.687 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:58.946 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:58.946 06:14:28 -- 
rpc/rpc.sh@31 -- # bdevs='[ 00:04:58.946 { 00:04:58.946 "name": "Malloc1", 00:04:58.946 "aliases": [ 00:04:58.946 "530ae756-e729-4033-a18d-eb1b9c41cce3" 00:04:58.946 ], 00:04:58.946 "product_name": "Malloc disk", 00:04:58.946 "block_size": 4096, 00:04:58.946 "num_blocks": 256, 00:04:58.946 "uuid": "530ae756-e729-4033-a18d-eb1b9c41cce3", 00:04:58.946 "assigned_rate_limits": { 00:04:58.946 "rw_ios_per_sec": 0, 00:04:58.946 "rw_mbytes_per_sec": 0, 00:04:58.946 "r_mbytes_per_sec": 0, 00:04:58.946 "w_mbytes_per_sec": 0 00:04:58.946 }, 00:04:58.946 "claimed": false, 00:04:58.946 "zoned": false, 00:04:58.946 "supported_io_types": { 00:04:58.946 "read": true, 00:04:58.946 "write": true, 00:04:58.946 "unmap": true, 00:04:58.946 "write_zeroes": true, 00:04:58.946 "flush": true, 00:04:58.946 "reset": true, 00:04:58.946 "compare": false, 00:04:58.946 "compare_and_write": false, 00:04:58.946 "abort": true, 00:04:58.946 "nvme_admin": false, 00:04:58.946 "nvme_io": false 00:04:58.946 }, 00:04:58.946 "memory_domains": [ 00:04:58.946 { 00:04:58.946 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:58.946 "dma_device_type": 2 00:04:58.946 } 00:04:58.946 ], 00:04:58.946 "driver_specific": {} 00:04:58.946 } 00:04:58.946 ]' 00:04:58.946 06:14:28 -- rpc/rpc.sh@32 -- # jq length 00:04:58.946 06:14:28 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:04:58.946 06:14:28 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:04:58.946 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:58.946 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:58.946 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:58.946 06:14:28 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:04:58.946 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:58.946 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:58.946 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:58.946 06:14:28 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:04:58.946 06:14:28 -- rpc/rpc.sh@36 -- # jq length 00:04:58.946 06:14:28 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:04:58.946 00:04:58.946 real 0m0.148s 00:04:58.946 user 0m0.088s 00:04:58.946 sys 0m0.023s 00:04:58.946 06:14:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:58.946 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:58.946 ************************************ 00:04:58.946 END TEST rpc_plugins 00:04:58.946 ************************************ 00:04:58.946 06:14:28 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:04:58.946 06:14:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:58.946 06:14:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:58.946 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:58.946 ************************************ 00:04:58.946 START TEST rpc_trace_cmd_test 00:04:58.946 ************************************ 00:04:58.946 06:14:28 -- common/autotest_common.sh@1114 -- # rpc_trace_cmd_test 00:04:58.946 06:14:28 -- rpc/rpc.sh@40 -- # local info 00:04:58.946 06:14:28 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:04:58.946 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:58.946 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:58.946 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:58.946 06:14:28 -- rpc/rpc.sh@42 -- # info='{ 00:04:58.946 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid5272", 00:04:58.946 "tpoint_group_mask": "0x8", 00:04:58.946 "iscsi_conn": { 00:04:58.946 "mask": "0x2", 
00:04:58.946 "tpoint_mask": "0x0" 00:04:58.946 }, 00:04:58.946 "scsi": { 00:04:58.946 "mask": "0x4", 00:04:58.946 "tpoint_mask": "0x0" 00:04:58.946 }, 00:04:58.946 "bdev": { 00:04:58.946 "mask": "0x8", 00:04:58.946 "tpoint_mask": "0xffffffffffffffff" 00:04:58.946 }, 00:04:58.946 "nvmf_rdma": { 00:04:58.946 "mask": "0x10", 00:04:58.946 "tpoint_mask": "0x0" 00:04:58.946 }, 00:04:58.946 "nvmf_tcp": { 00:04:58.946 "mask": "0x20", 00:04:58.946 "tpoint_mask": "0x0" 00:04:58.946 }, 00:04:58.946 "ftl": { 00:04:58.946 "mask": "0x40", 00:04:58.946 "tpoint_mask": "0x0" 00:04:58.946 }, 00:04:58.946 "blobfs": { 00:04:58.946 "mask": "0x80", 00:04:58.946 "tpoint_mask": "0x0" 00:04:58.946 }, 00:04:58.946 "dsa": { 00:04:58.946 "mask": "0x200", 00:04:58.946 "tpoint_mask": "0x0" 00:04:58.946 }, 00:04:58.946 "thread": { 00:04:58.946 "mask": "0x400", 00:04:58.946 "tpoint_mask": "0x0" 00:04:58.946 }, 00:04:58.946 "nvme_pcie": { 00:04:58.946 "mask": "0x800", 00:04:58.946 "tpoint_mask": "0x0" 00:04:58.946 }, 00:04:58.946 "iaa": { 00:04:58.946 "mask": "0x1000", 00:04:58.946 "tpoint_mask": "0x0" 00:04:58.946 }, 00:04:58.946 "nvme_tcp": { 00:04:58.946 "mask": "0x2000", 00:04:58.946 "tpoint_mask": "0x0" 00:04:58.946 }, 00:04:58.946 "bdev_nvme": { 00:04:58.946 "mask": "0x4000", 00:04:58.946 "tpoint_mask": "0x0" 00:04:58.946 } 00:04:58.946 }' 00:04:58.946 06:14:28 -- rpc/rpc.sh@43 -- # jq length 00:04:58.946 06:14:28 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:04:58.946 06:14:28 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:04:58.946 06:14:28 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:04:58.946 06:14:28 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:04:59.206 06:14:28 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:04:59.206 06:14:28 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:04:59.206 06:14:28 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:04:59.206 06:14:28 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:04:59.206 06:14:28 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:04:59.206 00:04:59.206 real 0m0.215s 00:04:59.206 user 0m0.176s 00:04:59.206 sys 0m0.034s 00:04:59.206 06:14:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:59.206 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:59.206 ************************************ 00:04:59.206 END TEST rpc_trace_cmd_test 00:04:59.206 ************************************ 00:04:59.206 06:14:28 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:04:59.206 06:14:28 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:04:59.206 06:14:28 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:04:59.206 06:14:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:59.206 06:14:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:59.206 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:59.206 ************************************ 00:04:59.206 START TEST rpc_daemon_integrity 00:04:59.206 ************************************ 00:04:59.206 06:14:28 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:04:59.206 06:14:28 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:04:59.206 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:59.206 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:59.206 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:59.206 06:14:28 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:04:59.206 06:14:28 -- rpc/rpc.sh@13 -- # jq length 00:04:59.206 06:14:28 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:04:59.206 06:14:28 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:04:59.206 
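In the trace_get_info dump above, each trace group owns one bit of the tpoint group mask: "bdev" is group 3, hence its mask "0x8", and because spdk_tgt was launched with -e bdev that group's per-tracepoint mask is fully enabled (0xffffffffffffffff) while every other group stays at 0x0. The bit arithmetic, spelled out:

```c
#include <inttypes.h>
#include <stdio.h>

int
main(void)
{
	unsigned int bdev_group_id = 3;			/* "bdev": "0x8" above */
	uint64_t group_mask = UINT64_C(1) << bdev_group_id;
	uint64_t all_tpoints = UINT64_MAX;

	printf("group mask:  0x%" PRIx64 "\n", group_mask);	/* 0x8 */
	printf("tpoint mask: 0x%" PRIx64 "\n", all_tpoints);	/* 0xffffffffffffffff */
	return 0;
}
```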
06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:59.206 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:59.206 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:59.206 06:14:28 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:04:59.206 06:14:28 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:04:59.206 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:59.206 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:59.206 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:59.206 06:14:28 -- rpc/rpc.sh@16 -- # bdevs='[ 00:04:59.206 { 00:04:59.206 "name": "Malloc2", 00:04:59.206 "aliases": [ 00:04:59.206 "ccc9b184-2380-438b-b637-7862d346a438" 00:04:59.206 ], 00:04:59.206 "product_name": "Malloc disk", 00:04:59.206 "block_size": 512, 00:04:59.206 "num_blocks": 16384, 00:04:59.206 "uuid": "ccc9b184-2380-438b-b637-7862d346a438", 00:04:59.206 "assigned_rate_limits": { 00:04:59.206 "rw_ios_per_sec": 0, 00:04:59.206 "rw_mbytes_per_sec": 0, 00:04:59.206 "r_mbytes_per_sec": 0, 00:04:59.206 "w_mbytes_per_sec": 0 00:04:59.206 }, 00:04:59.206 "claimed": false, 00:04:59.206 "zoned": false, 00:04:59.206 "supported_io_types": { 00:04:59.206 "read": true, 00:04:59.206 "write": true, 00:04:59.206 "unmap": true, 00:04:59.206 "write_zeroes": true, 00:04:59.206 "flush": true, 00:04:59.206 "reset": true, 00:04:59.206 "compare": false, 00:04:59.206 "compare_and_write": false, 00:04:59.206 "abort": true, 00:04:59.206 "nvme_admin": false, 00:04:59.206 "nvme_io": false 00:04:59.206 }, 00:04:59.206 "memory_domains": [ 00:04:59.206 { 00:04:59.206 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:59.206 "dma_device_type": 2 00:04:59.206 } 00:04:59.206 ], 00:04:59.206 "driver_specific": {} 00:04:59.206 } 00:04:59.206 ]' 00:04:59.206 06:14:28 -- rpc/rpc.sh@17 -- # jq length 00:04:59.465 06:14:28 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:04:59.465 06:14:28 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:04:59.465 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:59.465 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:59.465 [2024-11-27 06:14:28.776178] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:04:59.465 [2024-11-27 06:14:28.776210] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:04:59.465 [2024-11-27 06:14:28.776224] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x60b5980 00:04:59.465 [2024-11-27 06:14:28.776234] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:04:59.465 [2024-11-27 06:14:28.776946] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:04:59.465 [2024-11-27 06:14:28.776970] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:04:59.465 Passthru0 00:04:59.465 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:59.465 06:14:28 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:04:59.465 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:59.465 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:59.465 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:59.465 06:14:28 -- rpc/rpc.sh@20 -- # bdevs='[ 00:04:59.465 { 00:04:59.465 "name": "Malloc2", 00:04:59.465 "aliases": [ 00:04:59.465 "ccc9b184-2380-438b-b637-7862d346a438" 00:04:59.465 ], 00:04:59.465 "product_name": "Malloc disk", 00:04:59.465 "block_size": 512, 00:04:59.465 "num_blocks": 16384, 
00:04:59.465 "uuid": "ccc9b184-2380-438b-b637-7862d346a438", 00:04:59.465 "assigned_rate_limits": { 00:04:59.465 "rw_ios_per_sec": 0, 00:04:59.465 "rw_mbytes_per_sec": 0, 00:04:59.465 "r_mbytes_per_sec": 0, 00:04:59.465 "w_mbytes_per_sec": 0 00:04:59.465 }, 00:04:59.465 "claimed": true, 00:04:59.465 "claim_type": "exclusive_write", 00:04:59.465 "zoned": false, 00:04:59.465 "supported_io_types": { 00:04:59.465 "read": true, 00:04:59.465 "write": true, 00:04:59.465 "unmap": true, 00:04:59.465 "write_zeroes": true, 00:04:59.465 "flush": true, 00:04:59.465 "reset": true, 00:04:59.465 "compare": false, 00:04:59.465 "compare_and_write": false, 00:04:59.465 "abort": true, 00:04:59.465 "nvme_admin": false, 00:04:59.465 "nvme_io": false 00:04:59.465 }, 00:04:59.465 "memory_domains": [ 00:04:59.465 { 00:04:59.465 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:59.465 "dma_device_type": 2 00:04:59.465 } 00:04:59.465 ], 00:04:59.465 "driver_specific": {} 00:04:59.465 }, 00:04:59.465 { 00:04:59.465 "name": "Passthru0", 00:04:59.465 "aliases": [ 00:04:59.465 "857fdd43-5ccd-5a4f-a1e5-4a5cf6f1b9c6" 00:04:59.465 ], 00:04:59.465 "product_name": "passthru", 00:04:59.465 "block_size": 512, 00:04:59.465 "num_blocks": 16384, 00:04:59.465 "uuid": "857fdd43-5ccd-5a4f-a1e5-4a5cf6f1b9c6", 00:04:59.465 "assigned_rate_limits": { 00:04:59.465 "rw_ios_per_sec": 0, 00:04:59.465 "rw_mbytes_per_sec": 0, 00:04:59.466 "r_mbytes_per_sec": 0, 00:04:59.466 "w_mbytes_per_sec": 0 00:04:59.466 }, 00:04:59.466 "claimed": false, 00:04:59.466 "zoned": false, 00:04:59.466 "supported_io_types": { 00:04:59.466 "read": true, 00:04:59.466 "write": true, 00:04:59.466 "unmap": true, 00:04:59.466 "write_zeroes": true, 00:04:59.466 "flush": true, 00:04:59.466 "reset": true, 00:04:59.466 "compare": false, 00:04:59.466 "compare_and_write": false, 00:04:59.466 "abort": true, 00:04:59.466 "nvme_admin": false, 00:04:59.466 "nvme_io": false 00:04:59.466 }, 00:04:59.466 "memory_domains": [ 00:04:59.466 { 00:04:59.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:59.466 "dma_device_type": 2 00:04:59.466 } 00:04:59.466 ], 00:04:59.466 "driver_specific": { 00:04:59.466 "passthru": { 00:04:59.466 "name": "Passthru0", 00:04:59.466 "base_bdev_name": "Malloc2" 00:04:59.466 } 00:04:59.466 } 00:04:59.466 } 00:04:59.466 ]' 00:04:59.466 06:14:28 -- rpc/rpc.sh@21 -- # jq length 00:04:59.466 06:14:28 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:04:59.466 06:14:28 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:04:59.466 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:59.466 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:59.466 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:59.466 06:14:28 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:04:59.466 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:59.466 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:59.466 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:59.466 06:14:28 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:04:59.466 06:14:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:59.466 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:59.466 06:14:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:59.466 06:14:28 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:04:59.466 06:14:28 -- rpc/rpc.sh@26 -- # jq length 00:04:59.466 06:14:28 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:04:59.466 00:04:59.466 real 0m0.278s 00:04:59.466 user 0m0.169s 00:04:59.466 sys 0m0.047s 00:04:59.466 
06:14:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:59.466 06:14:28 -- common/autotest_common.sh@10 -- # set +x 00:04:59.466 ************************************ 00:04:59.466 END TEST rpc_daemon_integrity 00:04:59.466 ************************************ 00:04:59.466 06:14:28 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:04:59.466 06:14:28 -- rpc/rpc.sh@84 -- # killprocess 5272 00:04:59.466 06:14:28 -- common/autotest_common.sh@936 -- # '[' -z 5272 ']' 00:04:59.466 06:14:28 -- common/autotest_common.sh@940 -- # kill -0 5272 00:04:59.466 06:14:28 -- common/autotest_common.sh@941 -- # uname 00:04:59.466 06:14:28 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:59.466 06:14:28 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 5272 00:04:59.739 06:14:29 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:59.739 06:14:29 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:59.739 06:14:29 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 5272' 00:04:59.739 killing process with pid 5272 00:04:59.739 06:14:29 -- common/autotest_common.sh@955 -- # kill 5272 00:04:59.739 06:14:29 -- common/autotest_common.sh@960 -- # wait 5272 00:05:00.008 00:05:00.008 real 0m2.526s 00:05:00.008 user 0m3.198s 00:05:00.008 sys 0m0.729s 00:05:00.008 06:14:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:00.008 06:14:29 -- common/autotest_common.sh@10 -- # set +x 00:05:00.008 ************************************ 00:05:00.008 END TEST rpc 00:05:00.008 ************************************ 00:05:00.008 06:14:29 -- spdk/autotest.sh@164 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:00.008 06:14:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:00.008 06:14:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:00.008 06:14:29 -- common/autotest_common.sh@10 -- # set +x 00:05:00.008 ************************************ 00:05:00.008 START TEST rpc_client 00:05:00.008 ************************************ 00:05:00.008 06:14:29 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:00.008 * Looking for test storage... 
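The rpc_client suite starting here drives the same socket through SPDK's C client library instead of a shell wrapper; its rpc_client_test prints the OK seen a little further down. A condensed sketch of that call pattern, assuming the usual socket path (response decoding and most error handling trimmed, and rpc_get_methods chosen as a harmless example method; see test/rpc_client for the full flow):

```c
#include <sys/socket.h>
#include "spdk/jsonrpc.h"

/* One-shot JSON-RPC call from C: connect, send a request, poll for the
 * response, disconnect. */
static int
call_rpc_get_methods(void)
{
	struct spdk_jsonrpc_client *client;
	struct spdk_jsonrpc_client_request *req;
	struct spdk_json_write_ctx *w;
	int rc;

	client = spdk_jsonrpc_client_connect("/var/tmp/spdk.sock", AF_UNIX);
	if (client == NULL) {
		return -1;
	}

	req = spdk_jsonrpc_client_create_request();
	w = spdk_jsonrpc_begin_request(req, 1, "rpc_get_methods");
	spdk_jsonrpc_end_request(req, w);
	spdk_jsonrpc_client_send_request(client, req);

	/* Poll until the response (or an error) arrives. */
	do {
		rc = spdk_jsonrpc_client_poll(client, 1);
	} while (rc == 0);

	spdk_jsonrpc_client_close(client);
	return rc > 0 ? 0 : rc;
}
```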
00:05:00.008 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:00.008 06:14:29 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:00.008 06:14:29 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:00.008 06:14:29 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:00.268 06:14:29 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:00.268 06:14:29 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:00.268 06:14:29 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:00.268 06:14:29 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:00.268 06:14:29 -- scripts/common.sh@335 -- # IFS=.-: 00:05:00.268 06:14:29 -- scripts/common.sh@335 -- # read -ra ver1 00:05:00.268 06:14:29 -- scripts/common.sh@336 -- # IFS=.-: 00:05:00.268 06:14:29 -- scripts/common.sh@336 -- # read -ra ver2 00:05:00.268 06:14:29 -- scripts/common.sh@337 -- # local 'op=<' 00:05:00.268 06:14:29 -- scripts/common.sh@339 -- # ver1_l=2 00:05:00.268 06:14:29 -- scripts/common.sh@340 -- # ver2_l=1 00:05:00.268 06:14:29 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:00.268 06:14:29 -- scripts/common.sh@343 -- # case "$op" in 00:05:00.268 06:14:29 -- scripts/common.sh@344 -- # : 1 00:05:00.268 06:14:29 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:00.268 06:14:29 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:00.268 06:14:29 -- scripts/common.sh@364 -- # decimal 1 00:05:00.268 06:14:29 -- scripts/common.sh@352 -- # local d=1 00:05:00.268 06:14:29 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:00.268 06:14:29 -- scripts/common.sh@354 -- # echo 1 00:05:00.268 06:14:29 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:00.268 06:14:29 -- scripts/common.sh@365 -- # decimal 2 00:05:00.268 06:14:29 -- scripts/common.sh@352 -- # local d=2 00:05:00.268 06:14:29 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:00.268 06:14:29 -- scripts/common.sh@354 -- # echo 2 00:05:00.268 06:14:29 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:00.268 06:14:29 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:00.268 06:14:29 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:00.268 06:14:29 -- scripts/common.sh@367 -- # return 0 00:05:00.268 06:14:29 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:00.268 06:14:29 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:00.268 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.268 --rc genhtml_branch_coverage=1 00:05:00.268 --rc genhtml_function_coverage=1 00:05:00.268 --rc genhtml_legend=1 00:05:00.268 --rc geninfo_all_blocks=1 00:05:00.268 --rc geninfo_unexecuted_blocks=1 00:05:00.268 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:00.268 ' 00:05:00.268 06:14:29 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:00.268 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.268 --rc genhtml_branch_coverage=1 00:05:00.268 --rc genhtml_function_coverage=1 00:05:00.268 --rc genhtml_legend=1 00:05:00.268 --rc geninfo_all_blocks=1 00:05:00.268 --rc geninfo_unexecuted_blocks=1 00:05:00.268 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:00.268 ' 00:05:00.268 06:14:29 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:00.269 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.269 --rc genhtml_branch_coverage=1 
00:05:00.269 --rc genhtml_function_coverage=1 00:05:00.269 --rc genhtml_legend=1 00:05:00.269 --rc geninfo_all_blocks=1 00:05:00.269 --rc geninfo_unexecuted_blocks=1 00:05:00.269 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:00.269 ' 00:05:00.269 06:14:29 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:00.269 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.269 --rc genhtml_branch_coverage=1 00:05:00.269 --rc genhtml_function_coverage=1 00:05:00.269 --rc genhtml_legend=1 00:05:00.269 --rc geninfo_all_blocks=1 00:05:00.269 --rc geninfo_unexecuted_blocks=1 00:05:00.269 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:00.269 ' 00:05:00.269 06:14:29 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:00.269 OK 00:05:00.269 06:14:29 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:00.269 00:05:00.269 real 0m0.202s 00:05:00.269 user 0m0.111s 00:05:00.269 sys 0m0.104s 00:05:00.269 06:14:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:00.269 06:14:29 -- common/autotest_common.sh@10 -- # set +x 00:05:00.269 ************************************ 00:05:00.269 END TEST rpc_client 00:05:00.269 ************************************ 00:05:00.269 06:14:29 -- spdk/autotest.sh@165 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:00.269 06:14:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:00.269 06:14:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:00.269 06:14:29 -- common/autotest_common.sh@10 -- # set +x 00:05:00.269 ************************************ 00:05:00.269 START TEST json_config 00:05:00.269 ************************************ 00:05:00.269 06:14:29 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:00.269 06:14:29 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:00.269 06:14:29 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:00.269 06:14:29 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:00.269 06:14:29 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:00.269 06:14:29 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:00.269 06:14:29 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:00.269 06:14:29 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:00.269 06:14:29 -- scripts/common.sh@335 -- # IFS=.-: 00:05:00.269 06:14:29 -- scripts/common.sh@335 -- # read -ra ver1 00:05:00.269 06:14:29 -- scripts/common.sh@336 -- # IFS=.-: 00:05:00.269 06:14:29 -- scripts/common.sh@336 -- # read -ra ver2 00:05:00.269 06:14:29 -- scripts/common.sh@337 -- # local 'op=<' 00:05:00.269 06:14:29 -- scripts/common.sh@339 -- # ver1_l=2 00:05:00.269 06:14:29 -- scripts/common.sh@340 -- # ver2_l=1 00:05:00.269 06:14:29 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:00.269 06:14:29 -- scripts/common.sh@343 -- # case "$op" in 00:05:00.269 06:14:29 -- scripts/common.sh@344 -- # : 1 00:05:00.269 06:14:29 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:00.269 06:14:29 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:00.269 06:14:29 -- scripts/common.sh@364 -- # decimal 1 00:05:00.269 06:14:29 -- scripts/common.sh@352 -- # local d=1 00:05:00.269 06:14:29 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:00.269 06:14:29 -- scripts/common.sh@354 -- # echo 1 00:05:00.269 06:14:29 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:00.269 06:14:29 -- scripts/common.sh@365 -- # decimal 2 00:05:00.269 06:14:29 -- scripts/common.sh@352 -- # local d=2 00:05:00.269 06:14:29 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:00.269 06:14:29 -- scripts/common.sh@354 -- # echo 2 00:05:00.269 06:14:29 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:00.269 06:14:29 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:00.269 06:14:29 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:00.269 06:14:29 -- scripts/common.sh@367 -- # return 0 00:05:00.269 06:14:29 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:00.269 06:14:29 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:00.269 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.269 --rc genhtml_branch_coverage=1 00:05:00.269 --rc genhtml_function_coverage=1 00:05:00.269 --rc genhtml_legend=1 00:05:00.269 --rc geninfo_all_blocks=1 00:05:00.269 --rc geninfo_unexecuted_blocks=1 00:05:00.269 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:00.269 ' 00:05:00.269 06:14:29 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:00.269 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.269 --rc genhtml_branch_coverage=1 00:05:00.269 --rc genhtml_function_coverage=1 00:05:00.269 --rc genhtml_legend=1 00:05:00.269 --rc geninfo_all_blocks=1 00:05:00.269 --rc geninfo_unexecuted_blocks=1 00:05:00.269 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:00.269 ' 00:05:00.269 06:14:29 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:00.269 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.269 --rc genhtml_branch_coverage=1 00:05:00.269 --rc genhtml_function_coverage=1 00:05:00.269 --rc genhtml_legend=1 00:05:00.269 --rc geninfo_all_blocks=1 00:05:00.269 --rc geninfo_unexecuted_blocks=1 00:05:00.269 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:00.269 ' 00:05:00.269 06:14:29 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:00.269 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.269 --rc genhtml_branch_coverage=1 00:05:00.269 --rc genhtml_function_coverage=1 00:05:00.269 --rc genhtml_legend=1 00:05:00.269 --rc geninfo_all_blocks=1 00:05:00.269 --rc geninfo_unexecuted_blocks=1 00:05:00.269 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:00.269 ' 00:05:00.269 06:14:29 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:00.269 06:14:29 -- nvmf/common.sh@7 -- # uname -s 00:05:00.269 06:14:29 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:00.269 06:14:29 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:00.269 06:14:29 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:00.269 06:14:29 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:00.269 06:14:29 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:00.269 06:14:29 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:00.269 06:14:29 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:00.269 06:14:29 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:00.269 06:14:29 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:00.269 06:14:29 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:00.269 06:14:29 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:00.269 06:14:29 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:00.269 06:14:29 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:00.269 06:14:29 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:00.269 06:14:29 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:00.269 06:14:29 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:00.269 06:14:29 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:00.269 06:14:29 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:00.269 06:14:29 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:00.269 06:14:29 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:00.269 06:14:29 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:00.269 06:14:29 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:00.269 06:14:29 -- paths/export.sh@5 -- # export PATH 00:05:00.269 06:14:29 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:00.269 06:14:29 -- nvmf/common.sh@46 -- # : 0 00:05:00.269 06:14:29 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:00.269 06:14:29 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:00.269 06:14:29 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:00.269 06:14:29 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:00.269 06:14:29 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:00.269 06:14:29 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:00.269 06:14:29 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:00.269 
06:14:29 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:00.269 06:14:29 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:00.269 06:14:29 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:00.269 06:14:29 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:00.269 06:14:29 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:00.269 06:14:29 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:00.269 WARNING: No tests are enabled so not running JSON configuration tests 00:05:00.269 06:14:29 -- json_config/json_config.sh@27 -- # exit 0 00:05:00.269 00:05:00.269 real 0m0.145s 00:05:00.269 user 0m0.078s 00:05:00.269 sys 0m0.071s 00:05:00.269 06:14:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:00.269 06:14:29 -- common/autotest_common.sh@10 -- # set +x 00:05:00.269 ************************************ 00:05:00.269 END TEST json_config 00:05:00.269 ************************************ 00:05:00.529 06:14:29 -- spdk/autotest.sh@166 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:00.529 06:14:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:00.529 06:14:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:00.529 06:14:29 -- common/autotest_common.sh@10 -- # set +x 00:05:00.529 ************************************ 00:05:00.529 START TEST json_config_extra_key 00:05:00.529 ************************************ 00:05:00.529 06:14:29 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:00.529 06:14:29 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:00.529 06:14:29 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:00.529 06:14:29 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:00.529 06:14:29 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:00.529 06:14:29 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:00.529 06:14:29 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:00.529 06:14:29 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:00.529 06:14:29 -- scripts/common.sh@335 -- # IFS=.-: 00:05:00.529 06:14:29 -- scripts/common.sh@335 -- # read -ra ver1 00:05:00.529 06:14:29 -- scripts/common.sh@336 -- # IFS=.-: 00:05:00.529 06:14:29 -- scripts/common.sh@336 -- # read -ra ver2 00:05:00.529 06:14:29 -- scripts/common.sh@337 -- # local 'op=<' 00:05:00.529 06:14:29 -- scripts/common.sh@339 -- # ver1_l=2 00:05:00.529 06:14:29 -- scripts/common.sh@340 -- # ver2_l=1 00:05:00.529 06:14:29 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:00.529 06:14:29 -- scripts/common.sh@343 -- # case "$op" in 00:05:00.529 06:14:29 -- scripts/common.sh@344 -- # : 1 00:05:00.530 06:14:29 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:00.530 06:14:29 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:00.530 06:14:29 -- scripts/common.sh@364 -- # decimal 1 00:05:00.530 06:14:29 -- scripts/common.sh@352 -- # local d=1 00:05:00.530 06:14:29 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:00.530 06:14:29 -- scripts/common.sh@354 -- # echo 1 00:05:00.530 06:14:29 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:00.530 06:14:29 -- scripts/common.sh@365 -- # decimal 2 00:05:00.530 06:14:29 -- scripts/common.sh@352 -- # local d=2 00:05:00.530 06:14:29 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:00.530 06:14:29 -- scripts/common.sh@354 -- # echo 2 00:05:00.530 06:14:29 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:00.530 06:14:29 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:00.530 06:14:29 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:00.530 06:14:29 -- scripts/common.sh@367 -- # return 0 00:05:00.530 06:14:29 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:00.530 06:14:29 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:00.530 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.530 --rc genhtml_branch_coverage=1 00:05:00.530 --rc genhtml_function_coverage=1 00:05:00.530 --rc genhtml_legend=1 00:05:00.530 --rc geninfo_all_blocks=1 00:05:00.530 --rc geninfo_unexecuted_blocks=1 00:05:00.530 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:00.530 ' 00:05:00.530 06:14:29 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:00.530 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.530 --rc genhtml_branch_coverage=1 00:05:00.530 --rc genhtml_function_coverage=1 00:05:00.530 --rc genhtml_legend=1 00:05:00.530 --rc geninfo_all_blocks=1 00:05:00.530 --rc geninfo_unexecuted_blocks=1 00:05:00.530 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:00.530 ' 00:05:00.530 06:14:29 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:00.530 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.530 --rc genhtml_branch_coverage=1 00:05:00.530 --rc genhtml_function_coverage=1 00:05:00.530 --rc genhtml_legend=1 00:05:00.530 --rc geninfo_all_blocks=1 00:05:00.530 --rc geninfo_unexecuted_blocks=1 00:05:00.530 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:00.530 ' 00:05:00.530 06:14:29 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:00.530 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.530 --rc genhtml_branch_coverage=1 00:05:00.530 --rc genhtml_function_coverage=1 00:05:00.530 --rc genhtml_legend=1 00:05:00.530 --rc geninfo_all_blocks=1 00:05:00.530 --rc geninfo_unexecuted_blocks=1 00:05:00.530 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:00.530 ' 00:05:00.530 06:14:29 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:00.530 06:14:29 -- nvmf/common.sh@7 -- # uname -s 00:05:00.530 06:14:29 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:00.530 06:14:29 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:00.530 06:14:29 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:00.530 06:14:29 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:00.530 06:14:29 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:00.530 06:14:29 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:00.530 06:14:29 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:00.530 06:14:29 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:00.530 06:14:29 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:00.530 06:14:29 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:00.530 06:14:30 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:00.530 06:14:30 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:00.530 06:14:30 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:00.530 06:14:30 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:00.530 06:14:30 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:00.530 06:14:30 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:00.530 06:14:30 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:00.530 06:14:30 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:00.530 06:14:30 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:00.530 06:14:30 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:00.530 06:14:30 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:00.530 06:14:30 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:00.530 06:14:30 -- paths/export.sh@5 -- # export PATH 00:05:00.530 06:14:30 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:00.530 06:14:30 -- nvmf/common.sh@46 -- # : 0 00:05:00.530 06:14:30 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:00.530 06:14:30 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:00.530 06:14:30 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:00.530 06:14:30 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:00.530 06:14:30 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:00.530 06:14:30 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:00.530 06:14:30 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:00.530 
06:14:30 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:00.530 INFO: launching applications... 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=6072 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:00.530 Waiting for target to run... 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 6072 /var/tmp/spdk_tgt.sock 00:05:00.530 06:14:30 -- common/autotest_common.sh@829 -- # '[' -z 6072 ']' 00:05:00.530 06:14:30 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:00.530 06:14:30 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:00.530 06:14:30 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:00.530 06:14:30 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:00.530 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:00.530 06:14:30 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:00.530 06:14:30 -- common/autotest_common.sh@10 -- # set +x 00:05:00.530 [2024-11-27 06:14:30.046369] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:00.530 [2024-11-27 06:14:30.046458] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid6072 ] 00:05:00.790 EAL: No free 2048 kB hugepages reported on node 1 00:05:01.050 [2024-11-27 06:14:30.484367] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:01.050 [2024-11-27 06:14:30.574711] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:01.050 [2024-11-27 06:14:30.574822] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:01.618 06:14:30 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:01.618 06:14:30 -- common/autotest_common.sh@862 -- # return 0 00:05:01.618 06:14:30 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:01.618 00:05:01.618 06:14:30 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:01.618 INFO: shutting down applications... 00:05:01.618 06:14:30 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:01.618 06:14:30 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:01.618 06:14:30 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:01.618 06:14:30 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 6072 ]] 00:05:01.618 06:14:30 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 6072 00:05:01.618 06:14:30 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:01.618 06:14:30 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:01.618 06:14:30 -- json_config/json_config_extra_key.sh@50 -- # kill -0 6072 00:05:01.618 06:14:30 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:01.877 06:14:31 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:01.877 06:14:31 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:01.877 06:14:31 -- json_config/json_config_extra_key.sh@50 -- # kill -0 6072 00:05:01.877 06:14:31 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:01.877 06:14:31 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:01.877 06:14:31 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:01.877 06:14:31 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:01.877 SPDK target shutdown done 00:05:01.877 06:14:31 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:01.877 Success 00:05:01.877 00:05:01.877 real 0m1.565s 00:05:01.877 user 0m1.148s 00:05:01.877 sys 0m0.582s 00:05:01.877 06:14:31 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:01.877 06:14:31 -- common/autotest_common.sh@10 -- # set +x 00:05:01.877 ************************************ 00:05:01.877 END TEST json_config_extra_key 00:05:01.877 ************************************ 00:05:02.135 06:14:31 -- spdk/autotest.sh@167 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:02.135 06:14:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:02.135 06:14:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:02.135 06:14:31 -- common/autotest_common.sh@10 -- # set +x 00:05:02.135 ************************************ 00:05:02.135 START TEST alias_rpc 00:05:02.135 ************************************ 00:05:02.135 06:14:31 -- common/autotest_common.sh@1114 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:02.135 * Looking for test storage... 00:05:02.135 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:02.136 06:14:31 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:02.136 06:14:31 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:02.136 06:14:31 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:02.136 06:14:31 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:02.136 06:14:31 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:02.136 06:14:31 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:02.136 06:14:31 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:02.136 06:14:31 -- scripts/common.sh@335 -- # IFS=.-: 00:05:02.136 06:14:31 -- scripts/common.sh@335 -- # read -ra ver1 00:05:02.136 06:14:31 -- scripts/common.sh@336 -- # IFS=.-: 00:05:02.136 06:14:31 -- scripts/common.sh@336 -- # read -ra ver2 00:05:02.136 06:14:31 -- scripts/common.sh@337 -- # local 'op=<' 00:05:02.136 06:14:31 -- scripts/common.sh@339 -- # ver1_l=2 00:05:02.136 06:14:31 -- scripts/common.sh@340 -- # ver2_l=1 00:05:02.136 06:14:31 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:02.136 06:14:31 -- scripts/common.sh@343 -- # case "$op" in 00:05:02.136 06:14:31 -- scripts/common.sh@344 -- # : 1 00:05:02.136 06:14:31 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:02.136 06:14:31 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:02.136 06:14:31 -- scripts/common.sh@364 -- # decimal 1 00:05:02.136 06:14:31 -- scripts/common.sh@352 -- # local d=1 00:05:02.136 06:14:31 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:02.136 06:14:31 -- scripts/common.sh@354 -- # echo 1 00:05:02.136 06:14:31 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:02.136 06:14:31 -- scripts/common.sh@365 -- # decimal 2 00:05:02.136 06:14:31 -- scripts/common.sh@352 -- # local d=2 00:05:02.136 06:14:31 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:02.136 06:14:31 -- scripts/common.sh@354 -- # echo 2 00:05:02.136 06:14:31 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:02.136 06:14:31 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:02.136 06:14:31 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:02.136 06:14:31 -- scripts/common.sh@367 -- # return 0 00:05:02.136 06:14:31 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:02.136 06:14:31 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:02.136 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:02.136 --rc genhtml_branch_coverage=1 00:05:02.136 --rc genhtml_function_coverage=1 00:05:02.136 --rc genhtml_legend=1 00:05:02.136 --rc geninfo_all_blocks=1 00:05:02.136 --rc geninfo_unexecuted_blocks=1 00:05:02.136 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:02.136 ' 00:05:02.136 06:14:31 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:02.136 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:02.136 --rc genhtml_branch_coverage=1 00:05:02.136 --rc genhtml_function_coverage=1 00:05:02.136 --rc genhtml_legend=1 00:05:02.136 --rc geninfo_all_blocks=1 00:05:02.136 --rc geninfo_unexecuted_blocks=1 00:05:02.136 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:02.136 ' 00:05:02.136 06:14:31 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:02.136 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:02.136 --rc genhtml_branch_coverage=1 00:05:02.136 --rc genhtml_function_coverage=1 00:05:02.136 --rc genhtml_legend=1 00:05:02.136 --rc geninfo_all_blocks=1 00:05:02.136 --rc geninfo_unexecuted_blocks=1 00:05:02.136 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:02.136 ' 00:05:02.136 06:14:31 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:02.136 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:02.136 --rc genhtml_branch_coverage=1 00:05:02.136 --rc genhtml_function_coverage=1 00:05:02.136 --rc genhtml_legend=1 00:05:02.136 --rc geninfo_all_blocks=1 00:05:02.136 --rc geninfo_unexecuted_blocks=1 00:05:02.136 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:02.136 ' 00:05:02.136 06:14:31 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:02.136 06:14:31 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=6397 00:05:02.136 06:14:31 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 6397 00:05:02.136 06:14:31 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:02.136 06:14:31 -- common/autotest_common.sh@829 -- # '[' -z 6397 ']' 00:05:02.136 06:14:31 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:02.136 06:14:31 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:02.136 06:14:31 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:02.136 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:02.136 06:14:31 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:02.136 06:14:31 -- common/autotest_common.sh@10 -- # set +x 00:05:02.136 [2024-11-27 06:14:31.650926] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:02.136 [2024-11-27 06:14:31.650995] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid6397 ] 00:05:02.395 EAL: No free 2048 kB hugepages reported on node 1 00:05:02.395 [2024-11-27 06:14:31.717277] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:02.395 [2024-11-27 06:14:31.786371] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:02.395 [2024-11-27 06:14:31.786479] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:02.963 06:14:32 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:02.963 06:14:32 -- common/autotest_common.sh@862 -- # return 0 00:05:02.963 06:14:32 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:03.223 06:14:32 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 6397 00:05:03.223 06:14:32 -- common/autotest_common.sh@936 -- # '[' -z 6397 ']' 00:05:03.223 06:14:32 -- common/autotest_common.sh@940 -- # kill -0 6397 00:05:03.223 06:14:32 -- common/autotest_common.sh@941 -- # uname 00:05:03.223 06:14:32 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:03.223 06:14:32 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 6397 00:05:03.223 06:14:32 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:03.223 06:14:32 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:03.223 06:14:32 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 6397' 00:05:03.223 killing process with pid 6397 00:05:03.223 06:14:32 -- common/autotest_common.sh@955 -- # kill 6397 00:05:03.223 06:14:32 -- common/autotest_common.sh@960 -- # wait 6397 00:05:03.793 00:05:03.793 real 0m1.594s 00:05:03.793 user 0m1.704s 00:05:03.793 sys 0m0.455s 00:05:03.793 06:14:33 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:03.793 06:14:33 -- common/autotest_common.sh@10 -- # set +x 00:05:03.793 ************************************ 00:05:03.793 END TEST alias_rpc 00:05:03.793 ************************************ 00:05:03.793 06:14:33 -- spdk/autotest.sh@169 -- # [[ 0 -eq 0 ]] 00:05:03.793 06:14:33 -- spdk/autotest.sh@170 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:03.793 06:14:33 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:03.793 06:14:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:03.793 06:14:33 -- common/autotest_common.sh@10 -- # set +x 00:05:03.793 ************************************ 00:05:03.793 START TEST spdkcli_tcp 00:05:03.793 ************************************ 00:05:03.793 06:14:33 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:03.793 * Looking for test storage... 
00:05:03.793 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:03.793 06:14:33 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:03.793 06:14:33 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:03.793 06:14:33 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:03.793 06:14:33 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:03.793 06:14:33 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:03.793 06:14:33 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:03.793 06:14:33 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:03.793 06:14:33 -- scripts/common.sh@335 -- # IFS=.-: 00:05:03.793 06:14:33 -- scripts/common.sh@335 -- # read -ra ver1 00:05:03.793 06:14:33 -- scripts/common.sh@336 -- # IFS=.-: 00:05:03.793 06:14:33 -- scripts/common.sh@336 -- # read -ra ver2 00:05:03.793 06:14:33 -- scripts/common.sh@337 -- # local 'op=<' 00:05:03.793 06:14:33 -- scripts/common.sh@339 -- # ver1_l=2 00:05:03.793 06:14:33 -- scripts/common.sh@340 -- # ver2_l=1 00:05:03.793 06:14:33 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:03.793 06:14:33 -- scripts/common.sh@343 -- # case "$op" in 00:05:03.793 06:14:33 -- scripts/common.sh@344 -- # : 1 00:05:03.793 06:14:33 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:03.793 06:14:33 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:03.793 06:14:33 -- scripts/common.sh@364 -- # decimal 1 00:05:03.793 06:14:33 -- scripts/common.sh@352 -- # local d=1 00:05:03.793 06:14:33 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:03.793 06:14:33 -- scripts/common.sh@354 -- # echo 1 00:05:03.793 06:14:33 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:03.793 06:14:33 -- scripts/common.sh@365 -- # decimal 2 00:05:03.793 06:14:33 -- scripts/common.sh@352 -- # local d=2 00:05:03.793 06:14:33 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:03.793 06:14:33 -- scripts/common.sh@354 -- # echo 2 00:05:03.793 06:14:33 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:03.793 06:14:33 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:03.793 06:14:33 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:03.793 06:14:33 -- scripts/common.sh@367 -- # return 0 00:05:03.793 06:14:33 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:03.793 06:14:33 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:03.793 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:03.793 --rc genhtml_branch_coverage=1 00:05:03.793 --rc genhtml_function_coverage=1 00:05:03.793 --rc genhtml_legend=1 00:05:03.793 --rc geninfo_all_blocks=1 00:05:03.793 --rc geninfo_unexecuted_blocks=1 00:05:03.793 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:03.793 ' 00:05:03.793 06:14:33 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:03.793 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:03.793 --rc genhtml_branch_coverage=1 00:05:03.793 --rc genhtml_function_coverage=1 00:05:03.793 --rc genhtml_legend=1 00:05:03.793 --rc geninfo_all_blocks=1 00:05:03.793 --rc geninfo_unexecuted_blocks=1 00:05:03.793 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:03.793 ' 00:05:03.793 06:14:33 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:03.793 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:03.793 --rc genhtml_branch_coverage=1 
00:05:03.793 --rc genhtml_function_coverage=1 00:05:03.793 --rc genhtml_legend=1 00:05:03.793 --rc geninfo_all_blocks=1 00:05:03.793 --rc geninfo_unexecuted_blocks=1 00:05:03.793 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:03.793 ' 00:05:03.793 06:14:33 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:03.793 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:03.793 --rc genhtml_branch_coverage=1 00:05:03.793 --rc genhtml_function_coverage=1 00:05:03.793 --rc genhtml_legend=1 00:05:03.793 --rc geninfo_all_blocks=1 00:05:03.793 --rc geninfo_unexecuted_blocks=1 00:05:03.793 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:03.793 ' 00:05:03.793 06:14:33 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:03.793 06:14:33 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:03.793 06:14:33 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:03.793 06:14:33 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:03.793 06:14:33 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:03.793 06:14:33 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:03.793 06:14:33 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:03.793 06:14:33 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:03.793 06:14:33 -- common/autotest_common.sh@10 -- # set +x 00:05:03.793 06:14:33 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=6731 00:05:03.793 06:14:33 -- spdkcli/tcp.sh@27 -- # waitforlisten 6731 00:05:03.793 06:14:33 -- common/autotest_common.sh@829 -- # '[' -z 6731 ']' 00:05:03.793 06:14:33 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:03.793 06:14:33 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:03.793 06:14:33 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:03.793 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:03.793 06:14:33 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:03.793 06:14:33 -- common/autotest_common.sh@10 -- # set +x 00:05:03.793 06:14:33 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:03.793 [2024-11-27 06:14:33.295341] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:03.793 [2024-11-27 06:14:33.295434] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid6731 ] 00:05:04.053 EAL: No free 2048 kB hugepages reported on node 1 00:05:04.053 [2024-11-27 06:14:33.364150] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:04.053 [2024-11-27 06:14:33.439034] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:04.053 [2024-11-27 06:14:33.439172] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:04.053 [2024-11-27 06:14:33.439174] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:04.619 06:14:34 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:04.619 06:14:34 -- common/autotest_common.sh@862 -- # return 0 00:05:04.619 06:14:34 -- spdkcli/tcp.sh@31 -- # socat_pid=6973 00:05:04.619 06:14:34 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:04.619 06:14:34 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:04.878 [ 00:05:04.878 "spdk_get_version", 00:05:04.878 "rpc_get_methods", 00:05:04.878 "trace_get_info", 00:05:04.878 "trace_get_tpoint_group_mask", 00:05:04.878 "trace_disable_tpoint_group", 00:05:04.878 "trace_enable_tpoint_group", 00:05:04.878 "trace_clear_tpoint_mask", 00:05:04.878 "trace_set_tpoint_mask", 00:05:04.878 "vfu_tgt_set_base_path", 00:05:04.878 "framework_get_pci_devices", 00:05:04.878 "framework_get_config", 00:05:04.878 "framework_get_subsystems", 00:05:04.878 "iobuf_get_stats", 00:05:04.878 "iobuf_set_options", 00:05:04.878 "sock_set_default_impl", 00:05:04.878 "sock_impl_set_options", 00:05:04.878 "sock_impl_get_options", 00:05:04.878 "vmd_rescan", 00:05:04.878 "vmd_remove_device", 00:05:04.878 "vmd_enable", 00:05:04.878 "accel_get_stats", 00:05:04.878 "accel_set_options", 00:05:04.878 "accel_set_driver", 00:05:04.878 "accel_crypto_key_destroy", 00:05:04.878 "accel_crypto_keys_get", 00:05:04.878 "accel_crypto_key_create", 00:05:04.878 "accel_assign_opc", 00:05:04.878 "accel_get_module_info", 00:05:04.878 "accel_get_opc_assignments", 00:05:04.878 "notify_get_notifications", 00:05:04.878 "notify_get_types", 00:05:04.878 "bdev_get_histogram", 00:05:04.878 "bdev_enable_histogram", 00:05:04.878 "bdev_set_qos_limit", 00:05:04.878 "bdev_set_qd_sampling_period", 00:05:04.878 "bdev_get_bdevs", 00:05:04.878 "bdev_reset_iostat", 00:05:04.878 "bdev_get_iostat", 00:05:04.878 "bdev_examine", 00:05:04.878 "bdev_wait_for_examine", 00:05:04.878 "bdev_set_options", 00:05:04.878 "scsi_get_devices", 00:05:04.878 "thread_set_cpumask", 00:05:04.878 "framework_get_scheduler", 00:05:04.878 "framework_set_scheduler", 00:05:04.878 "framework_get_reactors", 00:05:04.878 "thread_get_io_channels", 00:05:04.878 "thread_get_pollers", 00:05:04.878 "thread_get_stats", 00:05:04.878 "framework_monitor_context_switch", 00:05:04.878 "spdk_kill_instance", 00:05:04.878 "log_enable_timestamps", 00:05:04.878 "log_get_flags", 00:05:04.878 "log_clear_flag", 00:05:04.878 "log_set_flag", 00:05:04.878 "log_get_level", 00:05:04.878 "log_set_level", 00:05:04.878 "log_get_print_level", 00:05:04.878 "log_set_print_level", 00:05:04.878 "framework_enable_cpumask_locks", 00:05:04.878 "framework_disable_cpumask_locks", 00:05:04.878 "framework_wait_init", 00:05:04.878 
"framework_start_init", 00:05:04.878 "virtio_blk_create_transport", 00:05:04.878 "virtio_blk_get_transports", 00:05:04.878 "vhost_controller_set_coalescing", 00:05:04.878 "vhost_get_controllers", 00:05:04.878 "vhost_delete_controller", 00:05:04.878 "vhost_create_blk_controller", 00:05:04.878 "vhost_scsi_controller_remove_target", 00:05:04.878 "vhost_scsi_controller_add_target", 00:05:04.878 "vhost_start_scsi_controller", 00:05:04.878 "vhost_create_scsi_controller", 00:05:04.878 "ublk_recover_disk", 00:05:04.878 "ublk_get_disks", 00:05:04.878 "ublk_stop_disk", 00:05:04.878 "ublk_start_disk", 00:05:04.878 "ublk_destroy_target", 00:05:04.878 "ublk_create_target", 00:05:04.878 "nbd_get_disks", 00:05:04.878 "nbd_stop_disk", 00:05:04.878 "nbd_start_disk", 00:05:04.878 "env_dpdk_get_mem_stats", 00:05:04.878 "nvmf_subsystem_get_listeners", 00:05:04.878 "nvmf_subsystem_get_qpairs", 00:05:04.878 "nvmf_subsystem_get_controllers", 00:05:04.878 "nvmf_get_stats", 00:05:04.878 "nvmf_get_transports", 00:05:04.878 "nvmf_create_transport", 00:05:04.878 "nvmf_get_targets", 00:05:04.878 "nvmf_delete_target", 00:05:04.878 "nvmf_create_target", 00:05:04.878 "nvmf_subsystem_allow_any_host", 00:05:04.878 "nvmf_subsystem_remove_host", 00:05:04.878 "nvmf_subsystem_add_host", 00:05:04.878 "nvmf_subsystem_remove_ns", 00:05:04.878 "nvmf_subsystem_add_ns", 00:05:04.878 "nvmf_subsystem_listener_set_ana_state", 00:05:04.878 "nvmf_discovery_get_referrals", 00:05:04.878 "nvmf_discovery_remove_referral", 00:05:04.878 "nvmf_discovery_add_referral", 00:05:04.878 "nvmf_subsystem_remove_listener", 00:05:04.878 "nvmf_subsystem_add_listener", 00:05:04.878 "nvmf_delete_subsystem", 00:05:04.878 "nvmf_create_subsystem", 00:05:04.878 "nvmf_get_subsystems", 00:05:04.878 "nvmf_set_crdt", 00:05:04.878 "nvmf_set_config", 00:05:04.878 "nvmf_set_max_subsystems", 00:05:04.878 "iscsi_set_options", 00:05:04.878 "iscsi_get_auth_groups", 00:05:04.878 "iscsi_auth_group_remove_secret", 00:05:04.878 "iscsi_auth_group_add_secret", 00:05:04.878 "iscsi_delete_auth_group", 00:05:04.878 "iscsi_create_auth_group", 00:05:04.878 "iscsi_set_discovery_auth", 00:05:04.878 "iscsi_get_options", 00:05:04.878 "iscsi_target_node_request_logout", 00:05:04.878 "iscsi_target_node_set_redirect", 00:05:04.878 "iscsi_target_node_set_auth", 00:05:04.878 "iscsi_target_node_add_lun", 00:05:04.878 "iscsi_get_connections", 00:05:04.878 "iscsi_portal_group_set_auth", 00:05:04.878 "iscsi_start_portal_group", 00:05:04.878 "iscsi_delete_portal_group", 00:05:04.878 "iscsi_create_portal_group", 00:05:04.878 "iscsi_get_portal_groups", 00:05:04.878 "iscsi_delete_target_node", 00:05:04.878 "iscsi_target_node_remove_pg_ig_maps", 00:05:04.878 "iscsi_target_node_add_pg_ig_maps", 00:05:04.878 "iscsi_create_target_node", 00:05:04.878 "iscsi_get_target_nodes", 00:05:04.878 "iscsi_delete_initiator_group", 00:05:04.878 "iscsi_initiator_group_remove_initiators", 00:05:04.878 "iscsi_initiator_group_add_initiators", 00:05:04.878 "iscsi_create_initiator_group", 00:05:04.878 "iscsi_get_initiator_groups", 00:05:04.878 "vfu_virtio_create_scsi_endpoint", 00:05:04.878 "vfu_virtio_scsi_remove_target", 00:05:04.878 "vfu_virtio_scsi_add_target", 00:05:04.878 "vfu_virtio_create_blk_endpoint", 00:05:04.878 "vfu_virtio_delete_endpoint", 00:05:04.878 "iaa_scan_accel_module", 00:05:04.878 "dsa_scan_accel_module", 00:05:04.878 "ioat_scan_accel_module", 00:05:04.878 "accel_error_inject_error", 00:05:04.878 "bdev_iscsi_delete", 00:05:04.878 "bdev_iscsi_create", 00:05:04.878 "bdev_iscsi_set_options", 
00:05:04.878 "bdev_virtio_attach_controller", 00:05:04.878 "bdev_virtio_scsi_get_devices", 00:05:04.878 "bdev_virtio_detach_controller", 00:05:04.878 "bdev_virtio_blk_set_hotplug", 00:05:04.878 "bdev_ftl_set_property", 00:05:04.878 "bdev_ftl_get_properties", 00:05:04.878 "bdev_ftl_get_stats", 00:05:04.878 "bdev_ftl_unmap", 00:05:04.878 "bdev_ftl_unload", 00:05:04.878 "bdev_ftl_delete", 00:05:04.878 "bdev_ftl_load", 00:05:04.878 "bdev_ftl_create", 00:05:04.878 "bdev_aio_delete", 00:05:04.878 "bdev_aio_rescan", 00:05:04.878 "bdev_aio_create", 00:05:04.878 "blobfs_create", 00:05:04.878 "blobfs_detect", 00:05:04.878 "blobfs_set_cache_size", 00:05:04.878 "bdev_zone_block_delete", 00:05:04.878 "bdev_zone_block_create", 00:05:04.878 "bdev_delay_delete", 00:05:04.878 "bdev_delay_create", 00:05:04.878 "bdev_delay_update_latency", 00:05:04.878 "bdev_split_delete", 00:05:04.878 "bdev_split_create", 00:05:04.878 "bdev_error_inject_error", 00:05:04.878 "bdev_error_delete", 00:05:04.878 "bdev_error_create", 00:05:04.878 "bdev_raid_set_options", 00:05:04.878 "bdev_raid_remove_base_bdev", 00:05:04.878 "bdev_raid_add_base_bdev", 00:05:04.878 "bdev_raid_delete", 00:05:04.878 "bdev_raid_create", 00:05:04.878 "bdev_raid_get_bdevs", 00:05:04.878 "bdev_lvol_grow_lvstore", 00:05:04.878 "bdev_lvol_get_lvols", 00:05:04.878 "bdev_lvol_get_lvstores", 00:05:04.878 "bdev_lvol_delete", 00:05:04.878 "bdev_lvol_set_read_only", 00:05:04.878 "bdev_lvol_resize", 00:05:04.878 "bdev_lvol_decouple_parent", 00:05:04.878 "bdev_lvol_inflate", 00:05:04.878 "bdev_lvol_rename", 00:05:04.878 "bdev_lvol_clone_bdev", 00:05:04.878 "bdev_lvol_clone", 00:05:04.878 "bdev_lvol_snapshot", 00:05:04.878 "bdev_lvol_create", 00:05:04.878 "bdev_lvol_delete_lvstore", 00:05:04.878 "bdev_lvol_rename_lvstore", 00:05:04.878 "bdev_lvol_create_lvstore", 00:05:04.878 "bdev_passthru_delete", 00:05:04.878 "bdev_passthru_create", 00:05:04.878 "bdev_nvme_cuse_unregister", 00:05:04.878 "bdev_nvme_cuse_register", 00:05:04.878 "bdev_opal_new_user", 00:05:04.878 "bdev_opal_set_lock_state", 00:05:04.878 "bdev_opal_delete", 00:05:04.878 "bdev_opal_get_info", 00:05:04.878 "bdev_opal_create", 00:05:04.878 "bdev_nvme_opal_revert", 00:05:04.878 "bdev_nvme_opal_init", 00:05:04.878 "bdev_nvme_send_cmd", 00:05:04.878 "bdev_nvme_get_path_iostat", 00:05:04.878 "bdev_nvme_get_mdns_discovery_info", 00:05:04.878 "bdev_nvme_stop_mdns_discovery", 00:05:04.878 "bdev_nvme_start_mdns_discovery", 00:05:04.878 "bdev_nvme_set_multipath_policy", 00:05:04.878 "bdev_nvme_set_preferred_path", 00:05:04.878 "bdev_nvme_get_io_paths", 00:05:04.878 "bdev_nvme_remove_error_injection", 00:05:04.878 "bdev_nvme_add_error_injection", 00:05:04.878 "bdev_nvme_get_discovery_info", 00:05:04.878 "bdev_nvme_stop_discovery", 00:05:04.878 "bdev_nvme_start_discovery", 00:05:04.878 "bdev_nvme_get_controller_health_info", 00:05:04.878 "bdev_nvme_disable_controller", 00:05:04.878 "bdev_nvme_enable_controller", 00:05:04.878 "bdev_nvme_reset_controller", 00:05:04.878 "bdev_nvme_get_transport_statistics", 00:05:04.878 "bdev_nvme_apply_firmware", 00:05:04.878 "bdev_nvme_detach_controller", 00:05:04.878 "bdev_nvme_get_controllers", 00:05:04.878 "bdev_nvme_attach_controller", 00:05:04.878 "bdev_nvme_set_hotplug", 00:05:04.878 "bdev_nvme_set_options", 00:05:04.878 "bdev_null_resize", 00:05:04.878 "bdev_null_delete", 00:05:04.878 "bdev_null_create", 00:05:04.878 "bdev_malloc_delete", 00:05:04.878 "bdev_malloc_create" 00:05:04.878 ] 00:05:04.878 06:14:34 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:05:04.878 06:14:34 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:04.878 06:14:34 -- common/autotest_common.sh@10 -- # set +x 00:05:04.878 06:14:34 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:04.878 06:14:34 -- spdkcli/tcp.sh@38 -- # killprocess 6731 00:05:04.878 06:14:34 -- common/autotest_common.sh@936 -- # '[' -z 6731 ']' 00:05:04.878 06:14:34 -- common/autotest_common.sh@940 -- # kill -0 6731 00:05:04.878 06:14:34 -- common/autotest_common.sh@941 -- # uname 00:05:04.878 06:14:34 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:04.878 06:14:34 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 6731 00:05:04.878 06:14:34 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:04.878 06:14:34 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:05.137 06:14:34 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 6731' 00:05:05.137 killing process with pid 6731 00:05:05.137 06:14:34 -- common/autotest_common.sh@955 -- # kill 6731 00:05:05.137 06:14:34 -- common/autotest_common.sh@960 -- # wait 6731 00:05:05.397 00:05:05.397 real 0m1.624s 00:05:05.397 user 0m2.923s 00:05:05.397 sys 0m0.536s 00:05:05.397 06:14:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:05.397 06:14:34 -- common/autotest_common.sh@10 -- # set +x 00:05:05.397 ************************************ 00:05:05.397 END TEST spdkcli_tcp 00:05:05.397 ************************************ 00:05:05.397 06:14:34 -- spdk/autotest.sh@173 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:05.397 06:14:34 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:05.397 06:14:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:05.397 06:14:34 -- common/autotest_common.sh@10 -- # set +x 00:05:05.397 ************************************ 00:05:05.397 START TEST dpdk_mem_utility 00:05:05.397 ************************************ 00:05:05.397 06:14:34 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:05.397 * Looking for test storage... 
00:05:05.397 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:05.397 06:14:34 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:05.397 06:14:34 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:05.397 06:14:34 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:05.397 06:14:34 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:05.397 06:14:34 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:05.397 06:14:34 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:05.397 06:14:34 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:05.397 06:14:34 -- scripts/common.sh@335 -- # IFS=.-: 00:05:05.397 06:14:34 -- scripts/common.sh@335 -- # read -ra ver1 00:05:05.397 06:14:34 -- scripts/common.sh@336 -- # IFS=.-: 00:05:05.397 06:14:34 -- scripts/common.sh@336 -- # read -ra ver2 00:05:05.397 06:14:34 -- scripts/common.sh@337 -- # local 'op=<' 00:05:05.397 06:14:34 -- scripts/common.sh@339 -- # ver1_l=2 00:05:05.397 06:14:34 -- scripts/common.sh@340 -- # ver2_l=1 00:05:05.397 06:14:34 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:05.397 06:14:34 -- scripts/common.sh@343 -- # case "$op" in 00:05:05.397 06:14:34 -- scripts/common.sh@344 -- # : 1 00:05:05.397 06:14:34 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:05.397 06:14:34 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:05.397 06:14:34 -- scripts/common.sh@364 -- # decimal 1 00:05:05.657 06:14:34 -- scripts/common.sh@352 -- # local d=1 00:05:05.657 06:14:34 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:05.657 06:14:34 -- scripts/common.sh@354 -- # echo 1 00:05:05.657 06:14:34 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:05.657 06:14:34 -- scripts/common.sh@365 -- # decimal 2 00:05:05.657 06:14:34 -- scripts/common.sh@352 -- # local d=2 00:05:05.657 06:14:34 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:05.657 06:14:34 -- scripts/common.sh@354 -- # echo 2 00:05:05.657 06:14:34 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:05.657 06:14:34 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:05.657 06:14:34 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:05.657 06:14:34 -- scripts/common.sh@367 -- # return 0 00:05:05.657 06:14:34 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:05.657 06:14:34 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:05.657 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:05.657 --rc genhtml_branch_coverage=1 00:05:05.657 --rc genhtml_function_coverage=1 00:05:05.657 --rc genhtml_legend=1 00:05:05.657 --rc geninfo_all_blocks=1 00:05:05.657 --rc geninfo_unexecuted_blocks=1 00:05:05.657 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:05.657 ' 00:05:05.657 06:14:34 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:05.657 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:05.657 --rc genhtml_branch_coverage=1 00:05:05.657 --rc genhtml_function_coverage=1 00:05:05.657 --rc genhtml_legend=1 00:05:05.657 --rc geninfo_all_blocks=1 00:05:05.657 --rc geninfo_unexecuted_blocks=1 00:05:05.657 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:05.657 ' 00:05:05.657 06:14:34 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:05.657 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:05.657 --rc 
genhtml_branch_coverage=1 00:05:05.657 --rc genhtml_function_coverage=1 00:05:05.657 --rc genhtml_legend=1 00:05:05.657 --rc geninfo_all_blocks=1 00:05:05.657 --rc geninfo_unexecuted_blocks=1 00:05:05.657 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:05.657 ' 00:05:05.657 06:14:34 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:05.657 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:05.657 --rc genhtml_branch_coverage=1 00:05:05.657 --rc genhtml_function_coverage=1 00:05:05.657 --rc genhtml_legend=1 00:05:05.657 --rc geninfo_all_blocks=1 00:05:05.657 --rc geninfo_unexecuted_blocks=1 00:05:05.657 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:05.657 ' 00:05:05.657 06:14:34 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:05.657 06:14:34 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:05.657 06:14:34 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=7077 00:05:05.657 06:14:34 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 7077 00:05:05.657 06:14:34 -- common/autotest_common.sh@829 -- # '[' -z 7077 ']' 00:05:05.657 06:14:34 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:05.657 06:14:34 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:05.657 06:14:34 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:05.657 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:05.657 06:14:34 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:05.657 06:14:34 -- common/autotest_common.sh@10 -- # set +x 00:05:05.657 [2024-11-27 06:14:34.961441] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:05.657 [2024-11-27 06:14:34.961512] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid7077 ] 00:05:05.657 EAL: No free 2048 kB hugepages reported on node 1 00:05:05.657 [2024-11-27 06:14:35.027293] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:05.657 [2024-11-27 06:14:35.097309] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:05.657 [2024-11-27 06:14:35.097436] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:06.595 06:14:35 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:06.595 06:14:35 -- common/autotest_common.sh@862 -- # return 0 00:05:06.595 06:14:35 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:06.595 06:14:35 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:06.595 06:14:35 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:06.595 06:14:35 -- common/autotest_common.sh@10 -- # set +x 00:05:06.595 { 00:05:06.595 "filename": "/tmp/spdk_mem_dump.txt" 00:05:06.595 } 00:05:06.595 06:14:35 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:06.595 06:14:35 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:06.595 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:06.595 1 heaps totaling size 814.000000 MiB 00:05:06.595 size: 814.000000 MiB heap id: 0 00:05:06.595 end heaps---------- 00:05:06.595 8 mempools totaling size 598.116089 MiB 00:05:06.595 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:06.595 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:06.595 size: 84.521057 MiB name: bdev_io_7077 00:05:06.595 size: 51.011292 MiB name: evtpool_7077 00:05:06.595 size: 50.003479 MiB name: msgpool_7077 00:05:06.595 size: 21.763794 MiB name: PDU_Pool 00:05:06.595 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:06.595 size: 0.026123 MiB name: Session_Pool 00:05:06.595 end mempools------- 00:05:06.595 6 memzones totaling size 4.142822 MiB 00:05:06.595 size: 1.000366 MiB name: RG_ring_0_7077 00:05:06.595 size: 1.000366 MiB name: RG_ring_1_7077 00:05:06.595 size: 1.000366 MiB name: RG_ring_4_7077 00:05:06.595 size: 1.000366 MiB name: RG_ring_5_7077 00:05:06.595 size: 0.125366 MiB name: RG_ring_2_7077 00:05:06.595 size: 0.015991 MiB name: RG_ring_3_7077 00:05:06.595 end memzones------- 00:05:06.595 06:14:35 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:06.595 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:06.595 list of free elements. 
size: 12.519348 MiB 00:05:06.595 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:06.595 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:06.595 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:06.595 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:06.595 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:06.595 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:06.595 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:06.595 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:06.595 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:06.595 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:06.595 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:06.595 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:06.595 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:06.595 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:06.595 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:06.595 list of standard malloc elements. size: 199.218079 MiB 00:05:06.595 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:06.595 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:06.595 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:06.595 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:06.595 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:06.595 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:06.595 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:06.595 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:06.595 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:06.595 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:06.595 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:06.595 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:06.595 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:06.595 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:06.595 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:06.595 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:06.595 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:06.595 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:06.595 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:06.595 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:06.595 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:06.595 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:06.595 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:06.595 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:06.595 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:06.595 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:06.595 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:06.595 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:06.595 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:06.595 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:06.595 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:06.595 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:06.595 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:06.595 element at address: 0x2000194bc740 with size: 0.000183 MiB
00:05:06.595 element at address: 0x20001aa95380 with size: 0.000183 MiB
00:05:06.595 element at address: 0x20001aa95440 with size: 0.000183 MiB
00:05:06.595 element at address: 0x200027e68f80 with size: 0.000183 MiB
00:05:06.595 element at address: 0x200027e69040 with size: 0.000183 MiB
00:05:06.595 element at address: 0x200027e6fc40 with size: 0.000183 MiB
00:05:06.595 element at address: 0x200027e6fe40 with size: 0.000183 MiB
00:05:06.595 element at address: 0x200027e6ff00 with size: 0.000183 MiB
00:05:06.595 list of memzone associated elements. size: 602.262573 MiB
00:05:06.595 element at address: 0x20001aa95500 with size: 211.416748 MiB
00:05:06.595 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0
00:05:06.595 element at address: 0x200027e6ffc0 with size: 157.562561 MiB
00:05:06.595 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0
00:05:06.595 element at address: 0x2000139fab80 with size: 84.020630 MiB
00:05:06.596 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_7077_0
00:05:06.596 element at address: 0x2000009ff380 with size: 48.003052 MiB
00:05:06.596 associated memzone info: size: 48.002930 MiB name: MP_evtpool_7077_0
00:05:06.596 element at address: 0x200003fff380 with size: 48.003052 MiB
00:05:06.596 associated memzone info: size: 48.002930 MiB name: MP_msgpool_7077_0
00:05:06.596 element at address: 0x2000195be940 with size: 20.255554 MiB
00:05:06.596 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0
00:05:06.596 element at address: 0x200031dfeb40 with size: 18.005066 MiB
00:05:06.596 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0
00:05:06.596 element at address: 0x2000005ffe00 with size: 2.000488 MiB
00:05:06.596 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_7077
00:05:06.596 element at address: 0x200003bffe00 with size: 2.000488 MiB
00:05:06.596 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_7077
00:05:06.596 element at address: 0x2000002d7d00 with size: 1.008118 MiB
00:05:06.596 associated memzone info: size: 1.007996 MiB name: MP_evtpool_7077
00:05:06.596 element at address: 0x20000b2fde40 with size: 1.008118 MiB
00:05:06.596 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool
00:05:06.596 element at address: 0x2000194bc800 with size: 1.008118 MiB
00:05:06.596 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool
00:05:06.596 element at address: 0x2000070fde40 with size: 1.008118 MiB
00:05:06.596 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool
00:05:06.596 element at address: 0x2000008fd240 with size: 1.008118 MiB
00:05:06.596 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool
00:05:06.596 element at address: 0x200003eff180 with size: 1.000488 MiB
00:05:06.596 associated memzone info: size: 1.000366 MiB name: RG_ring_0_7077
00:05:06.596 element at address: 0x200003affc00 with size: 1.000488 MiB
00:05:06.596 associated memzone info: size: 1.000366 MiB name: RG_ring_1_7077
00:05:06.596 element at address: 0x2000138fa980 with size: 1.000488 MiB
00:05:06.596 associated memzone info: size: 1.000366 MiB name: RG_ring_4_7077
00:05:06.596 element at address: 0x200031cfe940 with size: 1.000488 MiB
00:05:06.596 associated memzone info: size: 1.000366 MiB name: RG_ring_5_7077
00:05:06.596 element at address: 0x200003a5b100 with size: 0.500488 MiB
00:05:06.596 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_7077
00:05:06.596 element at address: 0x20000b27db80 with size: 0.500488 MiB
00:05:06.596 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool
00:05:06.596 element at address: 0x20000087cf80 with size: 0.500488 MiB
00:05:06.596 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool
00:05:06.596 element at address: 0x20001947c540 with size: 0.250488 MiB
00:05:06.596 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool
00:05:06.596 element at address: 0x200003adf880 with size: 0.125488 MiB
00:05:06.596 associated memzone info: size: 0.125366 MiB name: RG_ring_2_7077
00:05:06.596 element at address: 0x2000070f5b80 with size: 0.031738 MiB
00:05:06.596 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool
00:05:06.596 element at address: 0x200027e69100 with size: 0.023743 MiB
00:05:06.596 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0
00:05:06.596 element at address: 0x200003adb5c0 with size: 0.016113 MiB
00:05:06.596 associated memzone info: size: 0.015991 MiB name: RG_ring_3_7077
00:05:06.596 element at address: 0x200027e6f240 with size: 0.002441 MiB
00:05:06.596 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool
00:05:06.596 element at address: 0x2000002d7980 with size: 0.000305 MiB
00:05:06.596 associated memzone info: size: 0.000183 MiB name: MP_msgpool_7077
00:05:06.596 element at address: 0x200003adb3c0 with size: 0.000305 MiB
00:05:06.596 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_7077
00:05:06.596 element at address: 0x200027e6fd00 with size: 0.000305 MiB
00:05:06.596 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:05:06.596 06:14:35 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:05:06.596 06:14:35 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 7077
00:05:06.596 06:14:35 -- common/autotest_common.sh@936 -- # '[' -z 7077 ']'
00:05:06.596 06:14:35 -- common/autotest_common.sh@940 -- # kill -0 7077
00:05:06.596 06:14:35 -- common/autotest_common.sh@941 -- # uname
00:05:06.596 06:14:35 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:05:06.596 06:14:35 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 7077
00:05:06.596 06:14:35 -- common/autotest_common.sh@942 -- # process_name=reactor_0
00:05:06.596 06:14:35 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']'
00:05:06.596 06:14:35 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 7077' killing process with pid 7077
00:05:06.596 06:14:35 -- common/autotest_common.sh@955 -- # kill 7077
00:05:06.596 06:14:35 -- common/autotest_common.sh@960 -- # wait 7077
00:05:06.855
00:05:06.855 real 0m1.522s
00:05:06.855 user 0m1.580s
00:05:06.855 sys 0m0.457s
00:05:06.855 06:14:36 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:06.855 06:14:36 -- common/autotest_common.sh@10 -- # set +x
00:05:06.855 ************************************
00:05:06.855 END TEST dpdk_mem_utility
00:05:06.855 ************************************
00:05:06.855 06:14:36 -- spdk/autotest.sh@174 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh
00:05:06.855 06:14:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:06.855 06:14:36 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:06.855 06:14:36 -- common/autotest_common.sh@10 -- # set +x
00:05:06.855 ************************************
00:05:06.855 START TEST event
00:05:06.855 ************************************
00:05:06.855 06:14:36 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh
00:05:07.114 * Looking for test storage...
00:05:07.114 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event
00:05:07.114 06:14:36 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:07.114 06:14:36 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:07.114 06:14:36 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:07.114 06:14:36 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:07.114 06:14:36 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:07.114 06:14:36 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:07.114 06:14:36 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:07.114 06:14:36 -- scripts/common.sh@335 -- # IFS=.-:
00:05:07.114 06:14:36 -- scripts/common.sh@335 -- # read -ra ver1
00:05:07.114 06:14:36 -- scripts/common.sh@336 -- # IFS=.-:
00:05:07.114 06:14:36 -- scripts/common.sh@336 -- # read -ra ver2
00:05:07.114 06:14:36 -- scripts/common.sh@337 -- # local 'op=<'
00:05:07.114 06:14:36 -- scripts/common.sh@339 -- # ver1_l=2
00:05:07.114 06:14:36 -- scripts/common.sh@340 -- # ver2_l=1
00:05:07.114 06:14:36 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:07.114 06:14:36 -- scripts/common.sh@343 -- # case "$op" in
00:05:07.114 06:14:36 -- scripts/common.sh@344 -- # : 1
00:05:07.114 06:14:36 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:07.114 06:14:36 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:05:07.114 06:14:36 -- scripts/common.sh@364 -- # decimal 1
00:05:07.114 06:14:36 -- scripts/common.sh@352 -- # local d=1
00:05:07.114 06:14:36 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:05:07.114 06:14:36 -- scripts/common.sh@354 -- # echo 1
00:05:07.114 06:14:36 -- scripts/common.sh@364 -- # ver1[v]=1
00:05:07.114 06:14:36 -- scripts/common.sh@365 -- # decimal 2
00:05:07.114 06:14:36 -- scripts/common.sh@352 -- # local d=2
00:05:07.114 06:14:36 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:05:07.114 06:14:36 -- scripts/common.sh@354 -- # echo 2
00:05:07.114 06:14:36 -- scripts/common.sh@365 -- # ver2[v]=2
00:05:07.114 06:14:36 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:05:07.114 06:14:36 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:05:07.114 06:14:36 -- scripts/common.sh@367 -- # return 0
00:05:07.114 06:14:36 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:05:07.114 06:14:36 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:05:07.114 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:07.114 --rc genhtml_branch_coverage=1
00:05:07.114 --rc genhtml_function_coverage=1
00:05:07.114 --rc genhtml_legend=1
00:05:07.114 --rc geninfo_all_blocks=1
00:05:07.114 --rc geninfo_unexecuted_blocks=1
00:05:07.114 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:07.114 '
00:05:07.114 06:14:36 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:05:07.114 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:07.114 --rc genhtml_branch_coverage=1
00:05:07.114 --rc genhtml_function_coverage=1
00:05:07.114 --rc genhtml_legend=1
00:05:07.114 --rc geninfo_all_blocks=1
00:05:07.114 --rc geninfo_unexecuted_blocks=1
00:05:07.114 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:07.114 '
00:05:07.114 06:14:36 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:05:07.114 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:07.114 --rc genhtml_branch_coverage=1
00:05:07.114 --rc genhtml_function_coverage=1
00:05:07.114 --rc genhtml_legend=1
00:05:07.114 --rc geninfo_all_blocks=1
00:05:07.115 --rc geninfo_unexecuted_blocks=1
00:05:07.115 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:07.115 '
00:05:07.115 06:14:36 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:05:07.115 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:07.115 --rc genhtml_branch_coverage=1
00:05:07.115 --rc genhtml_function_coverage=1
00:05:07.115 --rc genhtml_legend=1
00:05:07.115 --rc geninfo_all_blocks=1
00:05:07.115 --rc geninfo_unexecuted_blocks=1
00:05:07.115 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:07.115 '
00:05:07.115 06:14:36 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh
00:05:07.115 06:14:36 -- bdev/nbd_common.sh@6 -- # set -e
00:05:07.115 06:14:36 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:05:07.115 06:14:36 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']'
00:05:07.115 06:14:36 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:07.115 06:14:36 -- common/autotest_common.sh@10 -- # set +x
00:05:07.115 ************************************
00:05:07.115 START TEST event_perf
00:05:07.115 ************************************
00:05:07.115 06:14:36 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:05:07.115 Running I/O for 1 seconds...[2024-11-27 06:14:36.525832] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:07.115 [2024-11-27 06:14:36.525922] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid7409 ]
00:05:07.115 EAL: No free 2048 kB hugepages reported on node 1
00:05:07.115 [2024-11-27 06:14:36.595552] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:05:07.374 [2024-11-27 06:14:36.667927] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:07.374 [2024-11-27 06:14:36.668024] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:05:07.374 [2024-11-27 06:14:36.668110] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:05:07.374 [2024-11-27 06:14:36.668112] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:08.310 Running I/O for 1 seconds...
00:05:08.310 lcore 0: 192492
00:05:08.310 lcore 1: 192490
00:05:08.310 lcore 2: 192490
00:05:08.310 lcore 3: 192491
00:05:08.310 done.
00:05:08.310
00:05:08.310 real 0m1.223s
00:05:08.310 user 0m4.131s
00:05:08.310 sys 0m0.088s
00:05:08.310 06:14:37 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:08.310 06:14:37 -- common/autotest_common.sh@10 -- # set +x
00:05:08.310 ************************************
00:05:08.310 END TEST event_perf
00:05:08.310 ************************************
00:05:08.310 06:14:37 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:05:08.310 06:14:37 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']'
00:05:08.310 06:14:37 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:08.310 06:14:37 -- common/autotest_common.sh@10 -- # set +x
00:05:08.310 ************************************
00:05:08.310 START TEST event_reactor
00:05:08.310 ************************************
00:05:08.310 06:14:37 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:05:08.310 [2024-11-27 06:14:37.790626] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:08.310 [2024-11-27 06:14:37.790721] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid7696 ]
00:05:08.310 EAL: No free 2048 kB hugepages reported on node 1
00:05:08.569 [2024-11-27 06:14:37.860138] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:08.570 [2024-11-27 06:14:37.928540] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:09.507 test_start
00:05:09.507 oneshot
00:05:09.507 tick 100
00:05:09.507 tick 100
00:05:09.507 tick 250
00:05:09.507 tick 100
00:05:09.507 tick 100
00:05:09.507 tick 100
00:05:09.507 tick 250
00:05:09.507 tick 500
00:05:09.507 tick 100
00:05:09.507 tick 100
00:05:09.507 tick 250
00:05:09.507 tick 100
00:05:09.507 tick 100
00:05:09.507 test_end
00:05:09.507
00:05:09.507 real 0m1.218s
00:05:09.507 user 0m1.128s
00:05:09.507 sys 0m0.085s
00:05:09.507 06:14:38 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:09.507 06:14:38 -- common/autotest_common.sh@10 -- # set +x
00:05:09.507 ************************************
00:05:09.507 END TEST event_reactor
00:05:09.507 ************************************
00:05:09.507 06:14:39 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:05:09.507 06:14:39 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']'
00:05:09.507 06:14:39 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:09.507 06:14:39 -- common/autotest_common.sh@10 -- # set +x
00:05:09.507 ************************************
00:05:09.507 START TEST event_reactor_perf
00:05:09.507 ************************************
00:05:09.507 06:14:39 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:05:09.767 [2024-11-27 06:14:39.053399] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:09.767 [2024-11-27 06:14:39.053487] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid7978 ]
00:05:09.767 EAL: No free 2048 kB hugepages reported on node 1
00:05:09.767 [2024-11-27 06:14:39.122622] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:09.767 [2024-11-27 06:14:39.190272] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:11.145 test_start
00:05:11.145 test_end
00:05:11.145 Performance: 972357 events per second
00:05:11.145
00:05:11.145 real 0m1.218s
00:05:11.145 user 0m1.122s
00:05:11.145 sys 0m0.091s
00:05:11.145 06:14:40 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:11.145 06:14:40 -- common/autotest_common.sh@10 -- # set +x
00:05:11.145 ************************************
00:05:11.145 END TEST event_reactor_perf
00:05:11.145 ************************************
00:05:11.145 06:14:40 -- event/event.sh@49 -- # uname -s
00:05:11.145 06:14:40 -- event/event.sh@49 -- # '[' Linux = Linux ']'
00:05:11.145 06:14:40 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:05:11.145 06:14:40 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:11.145 06:14:40 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:11.145 06:14:40 -- common/autotest_common.sh@10 -- # set +x
00:05:11.145 ************************************
00:05:11.145 START TEST event_scheduler
00:05:11.145 ************************************
00:05:11.145 06:14:40 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:05:11.145 * Looking for test storage...
00:05:11.145 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler
00:05:11.145 06:14:40 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:11.145 06:14:40 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:11.145 06:14:40 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:11.145 06:14:40 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:11.145 06:14:40 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:11.145 06:14:40 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:11.145 06:14:40 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:11.145 06:14:40 -- scripts/common.sh@335 -- # IFS=.-:
00:05:11.145 06:14:40 -- scripts/common.sh@335 -- # read -ra ver1
00:05:11.145 06:14:40 -- scripts/common.sh@336 -- # IFS=.-:
00:05:11.145 06:14:40 -- scripts/common.sh@336 -- # read -ra ver2
00:05:11.145 06:14:40 -- scripts/common.sh@337 -- # local 'op=<'
00:05:11.145 06:14:40 -- scripts/common.sh@339 -- # ver1_l=2
00:05:11.145 06:14:40 -- scripts/common.sh@340 -- # ver2_l=1
00:05:11.145 06:14:40 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:11.145 06:14:40 -- scripts/common.sh@343 -- # case "$op" in
00:05:11.145 06:14:40 -- scripts/common.sh@344 -- # : 1
00:05:11.145 06:14:40 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:11.145 06:14:40 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:05:11.145 06:14:40 -- scripts/common.sh@364 -- # decimal 1
00:05:11.145 06:14:40 -- scripts/common.sh@352 -- # local d=1
00:05:11.145 06:14:40 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:05:11.145 06:14:40 -- scripts/common.sh@354 -- # echo 1
00:05:11.145 06:14:40 -- scripts/common.sh@364 -- # ver1[v]=1
00:05:11.145 06:14:40 -- scripts/common.sh@365 -- # decimal 2
00:05:11.145 06:14:40 -- scripts/common.sh@352 -- # local d=2
00:05:11.145 06:14:40 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:05:11.145 06:14:40 -- scripts/common.sh@354 -- # echo 2
00:05:11.145 06:14:40 -- scripts/common.sh@365 -- # ver2[v]=2
00:05:11.145 06:14:40 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:05:11.145 06:14:40 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:05:11.145 06:14:40 -- scripts/common.sh@367 -- # return 0
00:05:11.145 06:14:40 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:05:11.145 06:14:40 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:05:11.145 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:11.145 --rc genhtml_branch_coverage=1
00:05:11.145 --rc genhtml_function_coverage=1
00:05:11.145 --rc genhtml_legend=1
00:05:11.145 --rc geninfo_all_blocks=1
00:05:11.145 --rc geninfo_unexecuted_blocks=1
00:05:11.145 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:11.145 '
00:05:11.145 06:14:40 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:05:11.145 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:11.145 --rc genhtml_branch_coverage=1
00:05:11.145 --rc genhtml_function_coverage=1
00:05:11.145 --rc genhtml_legend=1
00:05:11.145 --rc geninfo_all_blocks=1
00:05:11.145 --rc geninfo_unexecuted_blocks=1
00:05:11.145 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:11.145 '
00:05:11.145 06:14:40 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:05:11.145 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:11.145 --rc genhtml_branch_coverage=1
00:05:11.145 --rc genhtml_function_coverage=1
00:05:11.145 --rc genhtml_legend=1
00:05:11.145 --rc geninfo_all_blocks=1
00:05:11.145 --rc geninfo_unexecuted_blocks=1
00:05:11.145 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:11.145 '
00:05:11.145 06:14:40 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:05:11.145 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:11.145 --rc genhtml_branch_coverage=1
00:05:11.145 --rc genhtml_function_coverage=1
00:05:11.145 --rc genhtml_legend=1
00:05:11.145 --rc geninfo_all_blocks=1
00:05:11.145 --rc geninfo_unexecuted_blocks=1
00:05:11.145 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:11.145 '
00:05:11.145 06:14:40 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd
00:05:11.145 06:14:40 -- scheduler/scheduler.sh@35 -- # scheduler_pid=8301
00:05:11.145 06:14:40 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT
00:05:11.145 06:14:40 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f
00:05:11.145 06:14:40 -- scheduler/scheduler.sh@37 -- # waitforlisten 8301
00:05:11.145 06:14:40 -- common/autotest_common.sh@829 -- # '[' -z 8301 ']'
00:05:11.145 06:14:40 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:05:11.145 06:14:40 -- common/autotest_common.sh@834 -- # local max_retries=100
00:05:11.145 06:14:40 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:05:11.145 06:14:40 -- common/autotest_common.sh@838 -- # xtrace_disable
00:05:11.145 06:14:40 -- common/autotest_common.sh@10 -- # set +x
00:05:11.145 [2024-11-27 06:14:40.503638] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:11.145 [2024-11-27 06:14:40.503711] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid8301 ]
00:05:11.145 EAL: No free 2048 kB hugepages reported on node 1
00:05:11.145 [2024-11-27 06:14:40.568646] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:05:11.145 [2024-11-27 06:14:40.646878] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:11.145 [2024-11-27 06:14:40.646965] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:11.145 [2024-11-27 06:14:40.647049] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:05:11.145 [2024-11-27 06:14:40.647050] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:05:12.127 06:14:41 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:05:12.127 06:14:41 -- common/autotest_common.sh@862 -- # return 0
00:05:12.127 06:14:41 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic
00:05:12.127 06:14:41 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:12.127 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:12.127 POWER: Env isn't set yet!
00:05:12.127 POWER: Attempting to initialise ACPI cpufreq power management...
00:05:12.127 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor
00:05:12.127 POWER: Cannot set governor of lcore 0 to userspace
00:05:12.127 POWER: Attempting to initialise PSTAT power management...
00:05:12.127 POWER: Power management governor of lcore 0 has been set to 'performance' successfully
00:05:12.127 POWER: Initialized successfully for lcore 0 power management
00:05:12.127 POWER: Power management governor of lcore 1 has been set to 'performance' successfully
00:05:12.127 POWER: Initialized successfully for lcore 1 power management
00:05:12.127 POWER: Power management governor of lcore 2 has been set to 'performance' successfully
00:05:12.127 POWER: Initialized successfully for lcore 2 power management
00:05:12.127 POWER: Power management governor of lcore 3 has been set to 'performance' successfully
00:05:12.127 POWER: Initialized successfully for lcore 3 power management
00:05:12.127 [2024-11-27 06:14:41.399899] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20
00:05:12.127 [2024-11-27 06:14:41.399916] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80
00:05:12.127 [2024-11-27 06:14:41.399927] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95
00:05:12.127 06:14:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:12.127 06:14:41 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init
00:05:12.127 06:14:41 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:12.127 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:12.127 [2024-11-27 06:14:41.471245] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started.
00:05:12.128 06:14:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:12.128 06:14:41 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread
00:05:12.128 06:14:41 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:12.128 06:14:41 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:12.128 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:12.128 ************************************
00:05:12.128 START TEST scheduler_create_thread
00:05:12.128 ************************************
00:05:12.128 06:14:41 -- common/autotest_common.sh@1114 -- # scheduler_create_thread
00:05:12.128 06:14:41 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
00:05:12.128 06:14:41 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:12.128 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:12.128 2
00:05:12.128 06:14:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:12.128 06:14:41 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100
00:05:12.128 06:14:41 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:12.128 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:12.128 3
00:05:12.128 06:14:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:12.128 06:14:41 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100
00:05:12.128 06:14:41 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:12.128 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:12.128 4
00:05:12.128 06:14:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:12.128 06:14:41 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100
00:05:12.128 06:14:41 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:12.128 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:12.128 5
00:05:12.128 06:14:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:12.128 06:14:41 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
00:05:12.128 06:14:41 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:12.128 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:12.128 6
00:05:12.128 06:14:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:12.128 06:14:41 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0
00:05:12.128 06:14:41 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:12.128 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:12.128 7
00:05:12.128 06:14:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:12.128 06:14:41 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0
00:05:12.128 06:14:41 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:12.128 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:12.128 8
00:05:12.128 06:14:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:12.128 06:14:41 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0
00:05:12.128 06:14:41 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:12.128 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:12.128 9
00:05:12.128 06:14:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:12.128 06:14:41 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
00:05:12.128 06:14:41 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:12.128 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:12.128 10
00:05:12.128 06:14:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:12.128 06:14:41 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0
00:05:12.128 06:14:41 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:12.128 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:12.128 06:14:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:12.128 06:14:41 -- scheduler/scheduler.sh@22 -- # thread_id=11
00:05:12.128 06:14:41 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50
00:05:12.128 06:14:41 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:12.128 06:14:41 -- common/autotest_common.sh@10 -- # set +x
00:05:13.147 06:14:42 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:13.147 06:14:42 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100
00:05:13.147 06:14:42 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:13.147 06:14:42 -- common/autotest_common.sh@10 -- # set +x
00:05:14.528 06:14:43 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:14.528 06:14:43 -- scheduler/scheduler.sh@25 -- # thread_id=12
00:05:14.528 06:14:43 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12
00:05:14.528 06:14:43 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:14.528 06:14:43 -- common/autotest_common.sh@10 -- # set +x
00:05:15.464 06:14:44 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:15.464
00:05:15.464 real 0m3.382s
00:05:15.464 user 0m0.022s
00:05:15.464 sys 0m0.009s
00:05:15.464 06:14:44 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:15.464 06:14:44 -- common/autotest_common.sh@10 -- # set +x
00:05:15.464 ************************************
00:05:15.464 END TEST scheduler_create_thread
00:05:15.464 ************************************
00:05:15.464 06:14:44 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT
00:05:15.464 06:14:44 -- scheduler/scheduler.sh@46 -- # killprocess 8301
00:05:15.464 06:14:44 -- common/autotest_common.sh@936 -- # '[' -z 8301 ']'
00:05:15.464 06:14:44 -- common/autotest_common.sh@940 -- # kill -0 8301
00:05:15.464 06:14:44 -- common/autotest_common.sh@941 -- # uname
00:05:15.464 06:14:44 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:05:15.464 06:14:44 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 8301
00:05:15.464 06:14:44 -- common/autotest_common.sh@942 -- # process_name=reactor_2
00:05:15.464 06:14:44 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']'
00:05:15.464 06:14:44 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 8301' killing process with pid 8301
00:05:15.464 06:14:44 -- common/autotest_common.sh@955 -- # kill 8301
00:05:15.464 06:14:44 -- common/autotest_common.sh@960 -- # wait 8301
00:05:15.722 [2024-11-27 06:14:45.243177] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped.
00:05:15.982 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully
00:05:15.982 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original
00:05:15.982 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully
00:05:15.982 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original
00:05:15.982 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully
00:05:15.982 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original
00:05:15.982 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully
00:05:15.982 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original
00:05:15.982
00:05:15.982 real 0m5.164s
00:05:15.982 user 0m10.630s
00:05:15.982 sys 0m0.432s
00:05:15.982 06:14:45 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:15.982 06:14:45 -- common/autotest_common.sh@10 -- # set +x
00:05:15.982 ************************************
00:05:15.982 END TEST event_scheduler
00:05:15.982 ************************************
00:05:15.982 06:14:45 -- event/event.sh@51 -- # modprobe -n nbd
00:05:15.982 06:14:45 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test
00:05:15.982 06:14:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:15.982 06:14:45 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:15.982 06:14:45 -- common/autotest_common.sh@10 -- # set +x
00:05:15.982 ************************************
00:05:15.982 START TEST app_repeat
00:05:15.982 ************************************
00:05:15.982 06:14:45 -- common/autotest_common.sh@1114 -- # app_repeat_test
00:05:15.982 06:14:45 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:15.982 06:14:45 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:15.982 06:14:45 -- event/event.sh@13 -- # local nbd_list
00:05:15.982 06:14:45 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:16.241 06:14:45 -- event/event.sh@14 -- # local bdev_list
00:05:16.241 06:14:45 -- event/event.sh@15 -- # local repeat_times=4
00:05:16.241 06:14:45 -- event/event.sh@17 -- # modprobe nbd
00:05:16.241 06:14:45 -- event/event.sh@19 -- # repeat_pid=9181
00:05:16.241 06:14:45 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT
00:05:16.241 06:14:45 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 9181' Process app_repeat pid: 9181
00:05:16.241 06:14:45 -- event/event.sh@23 -- # for i in {0..2}
00:05:16.241 06:14:45 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' spdk_app_start Round 0
00:05:16.241 06:14:45 -- event/event.sh@25 -- # waitforlisten 9181 /var/tmp/spdk-nbd.sock
00:05:16.241 06:14:45 -- common/autotest_common.sh@829 -- # '[' -z 9181 ']'
00:05:16.241 06:14:45 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:05:16.241 06:14:45 -- common/autotest_common.sh@834 -- # local max_retries=100
00:05:16.241 06:14:45 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:05:16.241 06:14:45 -- common/autotest_common.sh@838 -- # xtrace_disable
00:05:16.241 06:14:45 -- common/autotest_common.sh@10 -- # set +x
00:05:16.241 06:14:45 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4
00:05:16.241 [2024-11-27 06:14:45.544032] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:16.241 [2024-11-27 06:14:45.544122] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid9181 ]
00:05:16.241 EAL: No free 2048 kB hugepages reported on node 1
00:05:16.241 [2024-11-27 06:14:45.615992] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:05:16.241 [2024-11-27 06:14:45.691142] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:16.241 [2024-11-27 06:14:45.691146] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:17.175 06:14:46 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:05:17.175 06:14:46 -- common/autotest_common.sh@862 -- # return 0
00:05:17.175 06:14:46 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:05:17.175 Malloc0
00:05:17.175 06:14:46 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:05:17.434 Malloc1
00:05:17.434 06:14:46 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@91 -- # local bdev_list
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@92 -- # local nbd_list
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@10 -- # local bdev_list
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@11 -- # local nbd_list
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@12 -- # local i
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:05:17.434 /dev/nbd0
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:05:17.434 06:14:46 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:05:17.434 06:14:46 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:05:17.434 06:14:46 -- common/autotest_common.sh@867 -- # local i
00:05:17.434 06:14:46 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:05:17.434 06:14:46 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:05:17.434 06:14:46 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:05:17.434 06:14:46 -- common/autotest_common.sh@871 -- # break
00:05:17.434 06:14:46 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:05:17.434 06:14:46 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:05:17.434 06:14:46 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:05:17.435 1+0 records in
00:05:17.435 1+0 records out
00:05:17.435 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000219068 s, 18.7 MB/s
00:05:17.435 06:14:46 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:17.435 06:14:46 -- common/autotest_common.sh@884 -- # size=4096
00:05:17.435 06:14:46 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:17.435 06:14:46 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:05:17.435 06:14:46 -- common/autotest_common.sh@887 -- # return 0
00:05:17.435 06:14:46 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:05:17.435 06:14:46 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:17.435 06:14:46 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:05:17.694 /dev/nbd1
00:05:17.694 06:14:47 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:05:17.694 06:14:47 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:05:17.694 06:14:47 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:05:17.694 06:14:47 -- common/autotest_common.sh@867 -- # local i
00:05:17.694 06:14:47 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:05:17.694 06:14:47 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:05:17.694 06:14:47 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:05:17.694 06:14:47 -- common/autotest_common.sh@871 -- # break
00:05:17.694 06:14:47 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:05:17.694 06:14:47 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:05:17.694 06:14:47 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:05:17.694 1+0 records in
00:05:17.694 1+0 records out
00:05:17.694 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000202927 s, 20.2 MB/s
00:05:17.694 06:14:47 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:17.694 06:14:47 -- common/autotest_common.sh@884 -- # size=4096
00:05:17.694 06:14:47 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:17.694 06:14:47 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:05:17.694 06:14:47 -- common/autotest_common.sh@887 -- # return 0
00:05:17.694 06:14:47 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:05:17.694 06:14:47 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:17.694 06:14:47 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:05:17.694 06:14:47 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:17.694 06:14:47 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:05:17.954 {
00:05:17.954 "nbd_device": "/dev/nbd0",
00:05:17.954 "bdev_name": "Malloc0"
00:05:17.954 },
00:05:17.954 {
00:05:17.954 "nbd_device": "/dev/nbd1",
00:05:17.954 "bdev_name": "Malloc1"
00:05:17.954 }
00:05:17.954 ]'
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@64 -- # echo '[
00:05:17.954 {
00:05:17.954 "nbd_device": "/dev/nbd0",
00:05:17.954 "bdev_name": "Malloc0"
00:05:17.954 },
00:05:17.954 {
00:05:17.954 "nbd_device": "/dev/nbd1",
00:05:17.954 "bdev_name": "Malloc1"
00:05:17.954 }
00:05:17.954 ]'
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:05:17.954 /dev/nbd1'
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:05:17.954 /dev/nbd1'
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@65 -- # count=2
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@66 -- # echo 2
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@95 -- # count=2
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@71 -- # local operation=write
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:05:17.954 256+0 records in
00:05:17.954 256+0 records out
00:05:17.954 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109724 s, 95.6 MB/s
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:05:17.954 256+0 records in
00:05:17.954 256+0 records out
00:05:17.954 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0201048 s, 52.2 MB/s
00:05:17.954 06:14:47 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:05:17.955 256+0 records in
00:05:17.955 256+0 records out
00:05:17.955 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212803 s, 49.3 MB/s
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@71 -- # local operation=verify
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@50 -- # local nbd_list
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@51 -- # local i
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:05:17.955 06:14:47 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:05:18.214 06:14:47 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:05:18.214 06:14:47 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:05:18.214 06:14:47 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:05:18.214 06:14:47 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:05:18.214 06:14:47 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:05:18.214 06:14:47 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:05:18.214 06:14:47 -- bdev/nbd_common.sh@41 -- # break
00:05:18.214 06:14:47 -- bdev/nbd_common.sh@45 -- # return 0
00:05:18.214 06:14:47 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:05:18.214 06:14:47 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:05:18.474 06:14:47 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:05:18.474 06:14:47 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:05:18.474 06:14:47 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:05:18.474 06:14:47 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:05:18.474 06:14:47 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:05:18.474 06:14:47 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:05:18.474 06:14:47 -- bdev/nbd_common.sh@41 -- # break
00:05:18.474 06:14:47 -- bdev/nbd_common.sh@45 -- # return 0
00:05:18.474 06:14:47 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:05:18.474 06:14:47 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:18.474 06:14:47 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:05:18.733 06:14:48 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:05:18.733 06:14:48 -- bdev/nbd_common.sh@64 -- # echo '[]'
00:05:18.733 06:14:48 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:05:18.733 06:14:48 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:05:18.733 06:14:48 -- bdev/nbd_common.sh@65 -- # echo ''
00:05:18.733 06:14:48 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:05:18.733 06:14:48 -- bdev/nbd_common.sh@65 -- # true
00:05:18.733 06:14:48 -- bdev/nbd_common.sh@65 -- # count=0
00:05:18.733 06:14:48 -- bdev/nbd_common.sh@66 -- # echo 0
00:05:18.733 06:14:48 -- bdev/nbd_common.sh@104 -- # count=0
00:05:18.733 06:14:48 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:05:18.733 06:14:48 -- bdev/nbd_common.sh@109 -- # return 0
00:05:18.733 06:14:48 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:05:18.992 06:14:48 -- event/event.sh@35 -- # sleep 3
00:05:18.992 [2024-11-27 06:14:48.493495] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:05:19.253 [2024-11-27 06:14:48.558759] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:19.253 [2024-11-27 06:14:48.558761] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:19.253 [2024-11-27 06:14:48.599759] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:05:19.253 [2024-11-27 06:14:48.599803] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
00:05:21.788 06:14:51 -- event/event.sh@23 -- # for i in {0..2}
00:05:21.788 06:14:51 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' spdk_app_start Round 1
00:05:21.788 06:14:51 -- event/event.sh@25 -- # waitforlisten 9181 /var/tmp/spdk-nbd.sock
00:05:21.788 06:14:51 -- common/autotest_common.sh@829 -- # '[' -z 9181 ']'
00:05:21.788 06:14:51 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:05:21.788 06:14:51 -- common/autotest_common.sh@834 -- # local max_retries=100
00:05:21.788 06:14:51 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:05:21.788 06:14:51 -- common/autotest_common.sh@838 -- # xtrace_disable
00:05:21.788 06:14:51 -- common/autotest_common.sh@10 -- # set +x
00:05:22.047 06:14:51 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:05:22.047 06:14:51 -- common/autotest_common.sh@862 -- # return 0
00:05:22.048 06:14:51 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:05:22.307 Malloc0
00:05:22.307 06:14:51 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:05:22.307 Malloc1
00:05:22.566 06:14:51 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@91 -- # local bdev_list
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@92 -- # local nbd_list
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@10 -- # local bdev_list
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@11 -- # local nbd_list
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@12 -- # local i
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:22.566 06:14:51 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:05:22.566 /dev/nbd0
00:05:22.566 06:14:52 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:05:22.566 06:14:52 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:05:22.566 06:14:52 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:05:22.566 06:14:52 -- common/autotest_common.sh@867 -- # local i
00:05:22.566 06:14:52 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:05:22.566 06:14:52 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:05:22.566 06:14:52 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:05:22.566 06:14:52 -- common/autotest_common.sh@871 -- # break
00:05:22.566 06:14:52 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:05:22.566 06:14:52 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:05:22.566 06:14:52 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:05:22.566 1+0 records in
00:05:22.566 1+0 records out
00:05:22.566 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270669 s, 15.1 MB/s
00:05:22.566 06:14:52 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:22.566 06:14:52 -- common/autotest_common.sh@884 -- # size=4096
00:05:22.566 06:14:52 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:22.566 06:14:52 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:05:22.566 06:14:52 -- common/autotest_common.sh@887 -- # return 0
00:05:22.566 06:14:52 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:05:22.566 06:14:52 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:22.566 06:14:52 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:05:22.825 /dev/nbd1
00:05:22.825 06:14:52 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:05:22.825 06:14:52 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:05:22.825 06:14:52 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:05:22.825 06:14:52 -- common/autotest_common.sh@867 -- # local i
00:05:22.825 06:14:52 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:05:22.825 06:14:52 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:05:22.825 06:14:52 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:05:22.825 06:14:52 -- common/autotest_common.sh@871 -- # break
00:05:22.825 06:14:52 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:05:22.825 06:14:52 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:05:22.825 06:14:52 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:05:22.825 1+0 records in
00:05:22.825 1+0 records out
00:05:22.825 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254505 s, 16.1 MB/s
00:05:22.825 06:14:52 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:22.825 06:14:52 -- common/autotest_common.sh@884 -- # size=4096
00:05:22.825 06:14:52 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:22.825 06:14:52 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:05:22.825 06:14:52 -- common/autotest_common.sh@887 -- # return 0
00:05:22.825 06:14:52 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:05:22.825 06:14:52 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:22.825 06:14:52 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:05:22.825 06:14:52 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:22.825 06:14:52 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:05:23.085 {
00:05:23.085 "nbd_device": "/dev/nbd0",
00:05:23.085 "bdev_name": "Malloc0"
00:05:23.085 },
00:05:23.085 {
00:05:23.085 "nbd_device": "/dev/nbd1",
00:05:23.085 "bdev_name": "Malloc1"
00:05:23.085 }
00:05:23.085 ]'
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@64 -- # echo '[
00:05:23.085 {
00:05:23.085 "nbd_device": "/dev/nbd0",
00:05:23.085 "bdev_name": "Malloc0"
00:05:23.085 },
00:05:23.085 {
00:05:23.085 "nbd_device": "/dev/nbd1",
00:05:23.085 "bdev_name": "Malloc1"
00:05:23.085 }
00:05:23.085 ]'
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:05:23.085 /dev/nbd1'
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:05:23.085 /dev/nbd1'
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@65 -- # count=2
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@66 -- # echo 2
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@95 -- # count=2
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@71 -- # local operation=write
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:05:23.085 06:14:52 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:05:23.086 256+0 records in
00:05:23.086 256+0 records out
00:05:23.086 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109734 s, 95.6 MB/s
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:05:23.086 256+0 records in
00:05:23.086 256+0 records out
00:05:23.086 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197697 s, 53.0 MB/s
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:05:23.086 256+0 records in
00:05:23.086 256+0 records out
00:05:23.086 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.021302 s, 49.2 MB/s
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@71 -- # local operation=verify
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@50 -- # local nbd_list
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@51 -- # local i
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:05:23.086 06:14:52 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:05:23.346 06:14:52 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:05:23.346 06:14:52 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:05:23.346 06:14:52 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:05:23.346 06:14:52 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:05:23.346 06:14:52 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:05:23.346 06:14:52 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:05:23.346 06:14:52 -- bdev/nbd_common.sh@41 -- # break
00:05:23.346 06:14:52 -- bdev/nbd_common.sh@45 -- # return 0
00:05:23.346 06:14:52 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:05:23.346 06:14:52 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:05:23.605 06:14:52 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:05:23.605 06:14:52 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:05:23.605 06:14:52 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:05:23.605 06:14:52 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:05:23.605 06:14:52 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:05:23.605 06:14:52 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:05:23.605 06:14:53 -- bdev/nbd_common.sh@41 -- # break
00:05:23.605 06:14:53 -- bdev/nbd_common.sh@45 -- # return 0
00:05:23.605 06:14:53 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:05:23.605 06:14:53 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:23.605 06:14:53 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:05:23.864 06:14:53 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:05:23.864 06:14:53 -- bdev/nbd_common.sh@64 -- # echo '[]'
00:05:23.864 06:14:53 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:05:23.864 06:14:53 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:05:23.864 06:14:53 -- bdev/nbd_common.sh@65 -- # echo ''
00:05:23.864 06:14:53 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:05:23.864 06:14:53 -- bdev/nbd_common.sh@65 -- # true
00:05:23.864 06:14:53 -- bdev/nbd_common.sh@65 -- # count=0
00:05:23.864 06:14:53 -- bdev/nbd_common.sh@66 -- # echo 0
00:05:23.864 06:14:53 -- bdev/nbd_common.sh@104 -- # count=0
00:05:23.864 06:14:53 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:05:23.864 06:14:53 -- bdev/nbd_common.sh@109 -- # return 0
00:05:23.864 06:14:53 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:05:24.123 06:14:53 -- event/event.sh@35 -- # sleep 3
00:05:24.123 [2024-11-27 06:14:53.592551] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:05:24.123 [2024-11-27 06:14:53.657006] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:24.123 [2024-11-27 06:14:53.657009] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:24.381 [2024-11-27 06:14:53.697681] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:05:24.381 [2024-11-27 06:14:53.697725] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
00:05:26.918 06:14:56 -- event/event.sh@23 -- # for i in {0..2} 00:05:26.918 06:14:56 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:05:26.918 spdk_app_start Round 2 00:05:26.918 06:14:56 -- event/event.sh@25 -- # waitforlisten 9181 /var/tmp/spdk-nbd.sock 00:05:26.918 06:14:56 -- common/autotest_common.sh@829 -- # '[' -z 9181 ']' 00:05:26.918 06:14:56 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:26.918 06:14:56 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:26.918 06:14:56 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:26.918 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:26.918 06:14:56 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:26.918 06:14:56 -- common/autotest_common.sh@10 -- # set +x 00:05:27.178 06:14:56 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:27.178 06:14:56 -- common/autotest_common.sh@862 -- # return 0 00:05:27.178 06:14:56 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:27.438 Malloc0 00:05:27.438 06:14:56 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:27.438 Malloc1 00:05:27.698 06:14:56 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@12 -- # local i 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:27.698 06:14:56 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:27.698 /dev/nbd0 00:05:27.698 06:14:57 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:27.698 06:14:57 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:27.698 06:14:57 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:05:27.698 06:14:57 -- common/autotest_common.sh@867 -- # local i 00:05:27.698 06:14:57 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:27.698 06:14:57 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:27.698 06:14:57 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:05:27.698 06:14:57 -- common/autotest_common.sh@871 -- # break 00:05:27.698 06:14:57 -- common/autotest_common.sh@882 -- # (( i = 1 
)) 00:05:27.698 06:14:57 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:27.698 06:14:57 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:27.698 1+0 records in 00:05:27.698 1+0 records out 00:05:27.698 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000219823 s, 18.6 MB/s 00:05:27.698 06:14:57 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:27.698 06:14:57 -- common/autotest_common.sh@884 -- # size=4096 00:05:27.698 06:14:57 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:27.698 06:14:57 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:27.698 06:14:57 -- common/autotest_common.sh@887 -- # return 0 00:05:27.698 06:14:57 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:27.698 06:14:57 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:27.698 06:14:57 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:27.958 /dev/nbd1 00:05:27.958 06:14:57 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:27.958 06:14:57 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:27.958 06:14:57 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:05:27.958 06:14:57 -- common/autotest_common.sh@867 -- # local i 00:05:27.958 06:14:57 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:27.958 06:14:57 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:27.958 06:14:57 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:05:27.958 06:14:57 -- common/autotest_common.sh@871 -- # break 00:05:27.958 06:14:57 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:27.958 06:14:57 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:27.958 06:14:57 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:27.958 1+0 records in 00:05:27.958 1+0 records out 00:05:27.958 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266275 s, 15.4 MB/s 00:05:27.958 06:14:57 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:27.958 06:14:57 -- common/autotest_common.sh@884 -- # size=4096 00:05:27.958 06:14:57 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:27.958 06:14:57 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:27.958 06:14:57 -- common/autotest_common.sh@887 -- # return 0 00:05:27.958 06:14:57 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:27.958 06:14:57 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:27.958 06:14:57 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:27.958 06:14:57 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:27.958 06:14:57 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:28.218 { 00:05:28.218 "nbd_device": "/dev/nbd0", 00:05:28.218 "bdev_name": "Malloc0" 00:05:28.218 }, 00:05:28.218 { 00:05:28.218 "nbd_device": "/dev/nbd1", 00:05:28.218 "bdev_name": "Malloc1" 00:05:28.218 } 00:05:28.218 ]' 00:05:28.218 06:14:57 -- 
bdev/nbd_common.sh@64 -- # echo '[ 00:05:28.218 { 00:05:28.218 "nbd_device": "/dev/nbd0", 00:05:28.218 "bdev_name": "Malloc0" 00:05:28.218 }, 00:05:28.218 { 00:05:28.218 "nbd_device": "/dev/nbd1", 00:05:28.218 "bdev_name": "Malloc1" 00:05:28.218 } 00:05:28.218 ]' 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:28.218 /dev/nbd1' 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:28.218 /dev/nbd1' 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@65 -- # count=2 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@95 -- # count=2 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:28.218 256+0 records in 00:05:28.218 256+0 records out 00:05:28.218 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106842 s, 98.1 MB/s 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:28.218 256+0 records in 00:05:28.218 256+0 records out 00:05:28.218 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200331 s, 52.3 MB/s 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:28.218 256+0 records in 00:05:28.218 256+0 records out 00:05:28.218 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0211953 s, 49.5 MB/s 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 
/dev/nbd1 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@51 -- # local i 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:28.218 06:14:57 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:28.478 06:14:57 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:28.478 06:14:57 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:28.478 06:14:57 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:28.478 06:14:57 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:28.478 06:14:57 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:28.478 06:14:57 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:28.478 06:14:57 -- bdev/nbd_common.sh@41 -- # break 00:05:28.478 06:14:57 -- bdev/nbd_common.sh@45 -- # return 0 00:05:28.478 06:14:57 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:28.478 06:14:57 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:28.738 06:14:58 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:28.738 06:14:58 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:28.738 06:14:58 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:28.738 06:14:58 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:28.738 06:14:58 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:28.738 06:14:58 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:28.738 06:14:58 -- bdev/nbd_common.sh@41 -- # break 00:05:28.738 06:14:58 -- bdev/nbd_common.sh@45 -- # return 0 00:05:28.738 06:14:58 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:28.738 06:14:58 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:28.738 06:14:58 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:28.998 06:14:58 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:28.998 06:14:58 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:28.998 06:14:58 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:28.998 06:14:58 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:28.998 06:14:58 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:28.998 06:14:58 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:28.998 06:14:58 -- bdev/nbd_common.sh@65 -- # true 00:05:28.998 06:14:58 -- bdev/nbd_common.sh@65 -- # count=0 00:05:28.998 06:14:58 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:28.998 06:14:58 -- bdev/nbd_common.sh@104 -- # count=0 00:05:28.998 06:14:58 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:28.998 06:14:58 -- bdev/nbd_common.sh@109 -- # return 0 00:05:28.998 06:14:58 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:29.257 06:14:58 -- event/event.sh@35 -- # sleep 3 00:05:29.257 [2024-11-27 06:14:58.727339] app.c: 
798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:29.257 [2024-11-27 06:14:58.792344] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:29.257 [2024-11-27 06:14:58.792346] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:29.516 [2024-11-27 06:14:58.833284] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:29.516 [2024-11-27 06:14:58.833329] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:32.052 06:15:01 -- event/event.sh@38 -- # waitforlisten 9181 /var/tmp/spdk-nbd.sock 00:05:32.052 06:15:01 -- common/autotest_common.sh@829 -- # '[' -z 9181 ']' 00:05:32.052 06:15:01 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:32.052 06:15:01 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:32.052 06:15:01 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:32.052 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:32.052 06:15:01 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:32.052 06:15:01 -- common/autotest_common.sh@10 -- # set +x 00:05:32.311 06:15:01 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:32.311 06:15:01 -- common/autotest_common.sh@862 -- # return 0 00:05:32.311 06:15:01 -- event/event.sh@39 -- # killprocess 9181 00:05:32.311 06:15:01 -- common/autotest_common.sh@936 -- # '[' -z 9181 ']' 00:05:32.311 06:15:01 -- common/autotest_common.sh@940 -- # kill -0 9181 00:05:32.311 06:15:01 -- common/autotest_common.sh@941 -- # uname 00:05:32.311 06:15:01 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:32.311 06:15:01 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 9181 00:05:32.311 06:15:01 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:32.311 06:15:01 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:32.311 06:15:01 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 9181' 00:05:32.311 killing process with pid 9181 00:05:32.311 06:15:01 -- common/autotest_common.sh@955 -- # kill 9181 00:05:32.311 06:15:01 -- common/autotest_common.sh@960 -- # wait 9181 00:05:32.571 spdk_app_start is called in Round 0. 00:05:32.571 Shutdown signal received, stop current app iteration 00:05:32.571 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:05:32.571 spdk_app_start is called in Round 1. 00:05:32.571 Shutdown signal received, stop current app iteration 00:05:32.571 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:05:32.571 spdk_app_start is called in Round 2. 00:05:32.571 Shutdown signal received, stop current app iteration 00:05:32.571 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:05:32.571 spdk_app_start is called in Round 3. 
00:05:32.571 Shutdown signal received, stop current app iteration 00:05:32.571 06:15:01 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:05:32.571 06:15:01 -- event/event.sh@42 -- # return 0 00:05:32.571 00:05:32.571 real 0m16.435s 00:05:32.571 user 0m35.008s 00:05:32.571 sys 0m3.081s 00:05:32.571 06:15:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:32.571 06:15:01 -- common/autotest_common.sh@10 -- # set +x 00:05:32.571 ************************************ 00:05:32.571 END TEST app_repeat 00:05:32.571 ************************************ 00:05:32.571 06:15:01 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:05:32.571 06:15:01 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:32.571 06:15:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:32.571 06:15:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:32.571 06:15:01 -- common/autotest_common.sh@10 -- # set +x 00:05:32.571 ************************************ 00:05:32.571 START TEST cpu_locks 00:05:32.571 ************************************ 00:05:32.571 06:15:01 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:32.571 * Looking for test storage... 00:05:32.571 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:32.571 06:15:02 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:32.571 06:15:02 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:32.571 06:15:02 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:32.831 06:15:02 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:32.831 06:15:02 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:32.831 06:15:02 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:32.831 06:15:02 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:32.831 06:15:02 -- scripts/common.sh@335 -- # IFS=.-: 00:05:32.831 06:15:02 -- scripts/common.sh@335 -- # read -ra ver1 00:05:32.831 06:15:02 -- scripts/common.sh@336 -- # IFS=.-: 00:05:32.831 06:15:02 -- scripts/common.sh@336 -- # read -ra ver2 00:05:32.831 06:15:02 -- scripts/common.sh@337 -- # local 'op=<' 00:05:32.831 06:15:02 -- scripts/common.sh@339 -- # ver1_l=2 00:05:32.831 06:15:02 -- scripts/common.sh@340 -- # ver2_l=1 00:05:32.831 06:15:02 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:32.831 06:15:02 -- scripts/common.sh@343 -- # case "$op" in 00:05:32.831 06:15:02 -- scripts/common.sh@344 -- # : 1 00:05:32.831 06:15:02 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:32.831 06:15:02 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:32.831 06:15:02 -- scripts/common.sh@364 -- # decimal 1 00:05:32.831 06:15:02 -- scripts/common.sh@352 -- # local d=1 00:05:32.831 06:15:02 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:32.831 06:15:02 -- scripts/common.sh@354 -- # echo 1 00:05:32.831 06:15:02 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:32.831 06:15:02 -- scripts/common.sh@365 -- # decimal 2 00:05:32.831 06:15:02 -- scripts/common.sh@352 -- # local d=2 00:05:32.831 06:15:02 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:32.831 06:15:02 -- scripts/common.sh@354 -- # echo 2 00:05:32.831 06:15:02 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:32.831 06:15:02 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:32.831 06:15:02 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:32.831 06:15:02 -- scripts/common.sh@367 -- # return 0 00:05:32.831 06:15:02 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:32.831 06:15:02 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:32.831 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.831 --rc genhtml_branch_coverage=1 00:05:32.831 --rc genhtml_function_coverage=1 00:05:32.831 --rc genhtml_legend=1 00:05:32.831 --rc geninfo_all_blocks=1 00:05:32.831 --rc geninfo_unexecuted_blocks=1 00:05:32.831 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.831 ' 00:05:32.831 06:15:02 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:32.831 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.831 --rc genhtml_branch_coverage=1 00:05:32.831 --rc genhtml_function_coverage=1 00:05:32.831 --rc genhtml_legend=1 00:05:32.831 --rc geninfo_all_blocks=1 00:05:32.831 --rc geninfo_unexecuted_blocks=1 00:05:32.831 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.831 ' 00:05:32.831 06:15:02 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:32.831 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.831 --rc genhtml_branch_coverage=1 00:05:32.831 --rc genhtml_function_coverage=1 00:05:32.831 --rc genhtml_legend=1 00:05:32.831 --rc geninfo_all_blocks=1 00:05:32.831 --rc geninfo_unexecuted_blocks=1 00:05:32.831 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.831 ' 00:05:32.831 06:15:02 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:32.831 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.831 --rc genhtml_branch_coverage=1 00:05:32.831 --rc genhtml_function_coverage=1 00:05:32.831 --rc genhtml_legend=1 00:05:32.831 --rc geninfo_all_blocks=1 00:05:32.831 --rc geninfo_unexecuted_blocks=1 00:05:32.831 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.831 ' 00:05:32.831 06:15:02 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:05:32.831 06:15:02 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:05:32.831 06:15:02 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:05:32.831 06:15:02 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:05:32.831 06:15:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:32.831 06:15:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:32.831 06:15:02 -- common/autotest_common.sh@10 -- # set +x 00:05:32.831 ************************************ 00:05:32.831 START TEST default_locks 
00:05:32.831 ************************************ 00:05:32.831 06:15:02 -- common/autotest_common.sh@1114 -- # default_locks 00:05:32.831 06:15:02 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=12504 00:05:32.831 06:15:02 -- event/cpu_locks.sh@47 -- # waitforlisten 12504 00:05:32.831 06:15:02 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:32.831 06:15:02 -- common/autotest_common.sh@829 -- # '[' -z 12504 ']' 00:05:32.831 06:15:02 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:32.831 06:15:02 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:32.831 06:15:02 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:32.831 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:32.831 06:15:02 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:32.831 06:15:02 -- common/autotest_common.sh@10 -- # set +x 00:05:32.831 [2024-11-27 06:15:02.217969] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:32.831 [2024-11-27 06:15:02.218061] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid12504 ] 00:05:32.831 EAL: No free 2048 kB hugepages reported on node 1 00:05:32.831 [2024-11-27 06:15:02.287087] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:32.831 [2024-11-27 06:15:02.363105] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:32.831 [2024-11-27 06:15:02.363213] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:33.770 06:15:03 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:33.770 06:15:03 -- common/autotest_common.sh@862 -- # return 0 00:05:33.770 06:15:03 -- event/cpu_locks.sh@49 -- # locks_exist 12504 00:05:33.770 06:15:03 -- event/cpu_locks.sh@22 -- # lslocks -p 12504 00:05:33.770 06:15:03 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:34.029 lslocks: write error 00:05:34.029 06:15:03 -- event/cpu_locks.sh@50 -- # killprocess 12504 00:05:34.029 06:15:03 -- common/autotest_common.sh@936 -- # '[' -z 12504 ']' 00:05:34.029 06:15:03 -- common/autotest_common.sh@940 -- # kill -0 12504 00:05:34.029 06:15:03 -- common/autotest_common.sh@941 -- # uname 00:05:34.029 06:15:03 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:34.029 06:15:03 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 12504 00:05:34.289 06:15:03 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:34.289 06:15:03 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:34.289 06:15:03 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 12504' 00:05:34.289 killing process with pid 12504 00:05:34.289 06:15:03 -- common/autotest_common.sh@955 -- # kill 12504 00:05:34.289 06:15:03 -- common/autotest_common.sh@960 -- # wait 12504 00:05:34.549 06:15:03 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 12504 00:05:34.549 06:15:03 -- common/autotest_common.sh@650 -- # local es=0 00:05:34.549 06:15:03 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 12504 00:05:34.549 06:15:03 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:05:34.549 06:15:03 -- common/autotest_common.sh@642 -- # case "$(type 
-t "$arg")" in 00:05:34.549 06:15:03 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:05:34.549 06:15:03 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:34.549 06:15:03 -- common/autotest_common.sh@653 -- # waitforlisten 12504 00:05:34.549 06:15:03 -- common/autotest_common.sh@829 -- # '[' -z 12504 ']' 00:05:34.549 06:15:03 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:34.549 06:15:03 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:34.549 06:15:03 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:34.549 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:34.549 06:15:03 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:34.549 06:15:03 -- common/autotest_common.sh@10 -- # set +x 00:05:34.549 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (12504) - No such process 00:05:34.549 ERROR: process (pid: 12504) is no longer running 00:05:34.549 06:15:03 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:34.549 06:15:03 -- common/autotest_common.sh@862 -- # return 1 00:05:34.549 06:15:03 -- common/autotest_common.sh@653 -- # es=1 00:05:34.549 06:15:03 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:34.549 06:15:03 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:34.550 06:15:03 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:34.550 06:15:03 -- event/cpu_locks.sh@54 -- # no_locks 00:05:34.550 06:15:03 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:34.550 06:15:03 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:34.550 06:15:03 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:34.550 00:05:34.550 real 0m1.727s 00:05:34.550 user 0m1.831s 00:05:34.550 sys 0m0.591s 00:05:34.550 06:15:03 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:34.550 06:15:03 -- common/autotest_common.sh@10 -- # set +x 00:05:34.550 ************************************ 00:05:34.550 END TEST default_locks 00:05:34.550 ************************************ 00:05:34.550 06:15:03 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:05:34.550 06:15:03 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:34.550 06:15:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:34.550 06:15:03 -- common/autotest_common.sh@10 -- # set +x 00:05:34.550 ************************************ 00:05:34.550 START TEST default_locks_via_rpc 00:05:34.550 ************************************ 00:05:34.550 06:15:03 -- common/autotest_common.sh@1114 -- # default_locks_via_rpc 00:05:34.550 06:15:03 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=12874 00:05:34.550 06:15:03 -- event/cpu_locks.sh@63 -- # waitforlisten 12874 00:05:34.550 06:15:03 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:34.550 06:15:03 -- common/autotest_common.sh@829 -- # '[' -z 12874 ']' 00:05:34.550 06:15:03 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:34.550 06:15:03 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:34.550 06:15:03 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:34.550 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:34.550 06:15:03 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:34.550 06:15:03 -- common/autotest_common.sh@10 -- # set +x 00:05:34.550 [2024-11-27 06:15:03.988966] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:34.550 [2024-11-27 06:15:03.989052] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid12874 ] 00:05:34.550 EAL: No free 2048 kB hugepages reported on node 1 00:05:34.550 [2024-11-27 06:15:04.057046] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:34.809 [2024-11-27 06:15:04.132351] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:34.809 [2024-11-27 06:15:04.132461] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:35.393 06:15:04 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:35.393 06:15:04 -- common/autotest_common.sh@862 -- # return 0 00:05:35.393 06:15:04 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:05:35.393 06:15:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:35.393 06:15:04 -- common/autotest_common.sh@10 -- # set +x 00:05:35.393 06:15:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:35.393 06:15:04 -- event/cpu_locks.sh@67 -- # no_locks 00:05:35.393 06:15:04 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:35.393 06:15:04 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:35.393 06:15:04 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:35.393 06:15:04 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:05:35.393 06:15:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:35.393 06:15:04 -- common/autotest_common.sh@10 -- # set +x 00:05:35.393 06:15:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:35.393 06:15:04 -- event/cpu_locks.sh@71 -- # locks_exist 12874 00:05:35.393 06:15:04 -- event/cpu_locks.sh@22 -- # lslocks -p 12874 00:05:35.393 06:15:04 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:35.653 06:15:05 -- event/cpu_locks.sh@73 -- # killprocess 12874 00:05:35.653 06:15:05 -- common/autotest_common.sh@936 -- # '[' -z 12874 ']' 00:05:35.653 06:15:05 -- common/autotest_common.sh@940 -- # kill -0 12874 00:05:35.653 06:15:05 -- common/autotest_common.sh@941 -- # uname 00:05:35.653 06:15:05 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:35.653 06:15:05 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 12874 00:05:35.912 06:15:05 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:35.912 06:15:05 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:35.912 06:15:05 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 12874' 00:05:35.912 killing process with pid 12874 00:05:35.913 06:15:05 -- common/autotest_common.sh@955 -- # kill 12874 00:05:35.913 06:15:05 -- common/autotest_common.sh@960 -- # wait 12874 00:05:36.173 00:05:36.173 real 0m1.564s 00:05:36.173 user 0m1.654s 00:05:36.173 sys 0m0.531s 00:05:36.173 06:15:05 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:36.173 06:15:05 -- common/autotest_common.sh@10 -- # set +x 00:05:36.173 ************************************ 00:05:36.173 END TEST default_locks_via_rpc 00:05:36.173 ************************************ 00:05:36.173 06:15:05 -- event/cpu_locks.sh@168 -- # run_test 
non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:05:36.173 06:15:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:36.173 06:15:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:36.173 06:15:05 -- common/autotest_common.sh@10 -- # set +x 00:05:36.173 ************************************ 00:05:36.173 START TEST non_locking_app_on_locked_coremask 00:05:36.173 ************************************ 00:05:36.173 06:15:05 -- common/autotest_common.sh@1114 -- # non_locking_app_on_locked_coremask 00:05:36.173 06:15:05 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=13524 00:05:36.173 06:15:05 -- event/cpu_locks.sh@81 -- # waitforlisten 13524 /var/tmp/spdk.sock 00:05:36.173 06:15:05 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:36.173 06:15:05 -- common/autotest_common.sh@829 -- # '[' -z 13524 ']' 00:05:36.173 06:15:05 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:36.173 06:15:05 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:36.173 06:15:05 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:36.173 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:36.173 06:15:05 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:36.173 06:15:05 -- common/autotest_common.sh@10 -- # set +x 00:05:36.173 [2024-11-27 06:15:05.601814] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:36.173 [2024-11-27 06:15:05.601890] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid13524 ] 00:05:36.173 EAL: No free 2048 kB hugepages reported on node 1 00:05:36.173 [2024-11-27 06:15:05.670453] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:36.432 [2024-11-27 06:15:05.744593] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:36.432 [2024-11-27 06:15:05.744723] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:37.001 06:15:06 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:37.001 06:15:06 -- common/autotest_common.sh@862 -- # return 0 00:05:37.001 06:15:06 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=13819 00:05:37.001 06:15:06 -- event/cpu_locks.sh@85 -- # waitforlisten 13819 /var/tmp/spdk2.sock 00:05:37.001 06:15:06 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:05:37.001 06:15:06 -- common/autotest_common.sh@829 -- # '[' -z 13819 ']' 00:05:37.001 06:15:06 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:37.001 06:15:06 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:37.001 06:15:06 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:37.001 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:37.001 06:15:06 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:37.001 06:15:06 -- common/autotest_common.sh@10 -- # set +x 00:05:37.001 [2024-11-27 06:15:06.458019] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:37.002 [2024-11-27 06:15:06.458082] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid13819 ] 00:05:37.002 EAL: No free 2048 kB hugepages reported on node 1 00:05:37.261 [2024-11-27 06:15:06.548595] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:37.261 [2024-11-27 06:15:06.552625] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:37.261 [2024-11-27 06:15:06.697329] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:37.261 [2024-11-27 06:15:06.697461] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:37.829 06:15:07 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:37.829 06:15:07 -- common/autotest_common.sh@862 -- # return 0 00:05:37.829 06:15:07 -- event/cpu_locks.sh@87 -- # locks_exist 13524 00:05:37.829 06:15:07 -- event/cpu_locks.sh@22 -- # lslocks -p 13524 00:05:37.829 06:15:07 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:39.208 lslocks: write error 00:05:39.208 06:15:08 -- event/cpu_locks.sh@89 -- # killprocess 13524 00:05:39.208 06:15:08 -- common/autotest_common.sh@936 -- # '[' -z 13524 ']' 00:05:39.208 06:15:08 -- common/autotest_common.sh@940 -- # kill -0 13524 00:05:39.208 06:15:08 -- common/autotest_common.sh@941 -- # uname 00:05:39.208 06:15:08 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:39.208 06:15:08 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 13524 00:05:39.208 06:15:08 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:39.208 06:15:08 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:39.208 06:15:08 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 13524' 00:05:39.208 killing process with pid 13524 00:05:39.208 06:15:08 -- common/autotest_common.sh@955 -- # kill 13524 00:05:39.208 06:15:08 -- common/autotest_common.sh@960 -- # wait 13524 00:05:39.467 06:15:08 -- event/cpu_locks.sh@90 -- # killprocess 13819 00:05:39.467 06:15:08 -- common/autotest_common.sh@936 -- # '[' -z 13819 ']' 00:05:39.467 06:15:08 -- common/autotest_common.sh@940 -- # kill -0 13819 00:05:39.467 06:15:08 -- common/autotest_common.sh@941 -- # uname 00:05:39.467 06:15:08 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:39.467 06:15:08 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 13819 00:05:39.727 06:15:09 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:39.727 06:15:09 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:39.727 06:15:09 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 13819' 00:05:39.727 killing process with pid 13819 00:05:39.727 06:15:09 -- common/autotest_common.sh@955 -- # kill 13819 00:05:39.727 06:15:09 -- common/autotest_common.sh@960 -- # wait 13819 00:05:39.986 00:05:39.986 real 0m3.768s 00:05:39.986 user 0m4.036s 00:05:39.986 sys 0m1.251s 00:05:39.986 06:15:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:39.986 06:15:09 -- common/autotest_common.sh@10 -- # set +x 00:05:39.986 ************************************ 00:05:39.986 END TEST non_locking_app_on_locked_coremask 00:05:39.986 ************************************ 00:05:39.986 06:15:09 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:05:39.986 06:15:09 -- 
common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:39.986 06:15:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:39.986 06:15:09 -- common/autotest_common.sh@10 -- # set +x 00:05:39.986 ************************************ 00:05:39.986 START TEST locking_app_on_unlocked_coremask 00:05:39.986 ************************************ 00:05:39.986 06:15:09 -- common/autotest_common.sh@1114 -- # locking_app_on_unlocked_coremask 00:05:39.986 06:15:09 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=14393 00:05:39.986 06:15:09 -- event/cpu_locks.sh@99 -- # waitforlisten 14393 /var/tmp/spdk.sock 00:05:39.986 06:15:09 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:05:39.986 06:15:09 -- common/autotest_common.sh@829 -- # '[' -z 14393 ']' 00:05:39.986 06:15:09 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:39.986 06:15:09 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:39.986 06:15:09 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:39.986 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:39.986 06:15:09 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:39.986 06:15:09 -- common/autotest_common.sh@10 -- # set +x 00:05:39.986 [2024-11-27 06:15:09.420908] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:39.986 [2024-11-27 06:15:09.420980] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid14393 ] 00:05:39.986 EAL: No free 2048 kB hugepages reported on node 1 00:05:39.986 [2024-11-27 06:15:09.486639] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:39.986 [2024-11-27 06:15:09.486674] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:40.244 [2024-11-27 06:15:09.550245] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:40.244 [2024-11-27 06:15:09.550354] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:40.812 06:15:10 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:40.812 06:15:10 -- common/autotest_common.sh@862 -- # return 0 00:05:40.812 06:15:10 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=14409 00:05:40.812 06:15:10 -- event/cpu_locks.sh@103 -- # waitforlisten 14409 /var/tmp/spdk2.sock 00:05:40.812 06:15:10 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:40.812 06:15:10 -- common/autotest_common.sh@829 -- # '[' -z 14409 ']' 00:05:40.812 06:15:10 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:40.812 06:15:10 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:40.812 06:15:10 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:40.812 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:05:40.812 06:15:10 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:40.812 06:15:10 -- common/autotest_common.sh@10 -- # set +x 00:05:40.812 [2024-11-27 06:15:10.271806] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:40.812 [2024-11-27 06:15:10.271870] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid14409 ] 00:05:40.812 EAL: No free 2048 kB hugepages reported on node 1 00:05:41.070 [2024-11-27 06:15:10.364811] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:41.070 [2024-11-27 06:15:10.506633] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:41.070 [2024-11-27 06:15:10.506764] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:41.638 06:15:11 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:41.638 06:15:11 -- common/autotest_common.sh@862 -- # return 0 00:05:41.638 06:15:11 -- event/cpu_locks.sh@105 -- # locks_exist 14409 00:05:41.638 06:15:11 -- event/cpu_locks.sh@22 -- # lslocks -p 14409 00:05:41.638 06:15:11 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:42.577 lslocks: write error 00:05:42.577 06:15:11 -- event/cpu_locks.sh@107 -- # killprocess 14393 00:05:42.577 06:15:11 -- common/autotest_common.sh@936 -- # '[' -z 14393 ']' 00:05:42.577 06:15:11 -- common/autotest_common.sh@940 -- # kill -0 14393 00:05:42.577 06:15:11 -- common/autotest_common.sh@941 -- # uname 00:05:42.577 06:15:11 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:42.577 06:15:11 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 14393 00:05:42.577 06:15:12 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:42.577 06:15:12 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:42.577 06:15:12 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 14393' 00:05:42.577 killing process with pid 14393 00:05:42.577 06:15:12 -- common/autotest_common.sh@955 -- # kill 14393 00:05:42.577 06:15:12 -- common/autotest_common.sh@960 -- # wait 14393 00:05:43.146 06:15:12 -- event/cpu_locks.sh@108 -- # killprocess 14409 00:05:43.146 06:15:12 -- common/autotest_common.sh@936 -- # '[' -z 14409 ']' 00:05:43.146 06:15:12 -- common/autotest_common.sh@940 -- # kill -0 14409 00:05:43.146 06:15:12 -- common/autotest_common.sh@941 -- # uname 00:05:43.146 06:15:12 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:43.146 06:15:12 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 14409 00:05:43.405 06:15:12 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:43.405 06:15:12 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:43.405 06:15:12 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 14409' 00:05:43.405 killing process with pid 14409 00:05:43.405 06:15:12 -- common/autotest_common.sh@955 -- # kill 14409 00:05:43.405 06:15:12 -- common/autotest_common.sh@960 -- # wait 14409 00:05:43.665 00:05:43.665 real 0m3.603s 00:05:43.665 user 0m3.871s 00:05:43.665 sys 0m1.174s 00:05:43.665 06:15:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:43.665 06:15:12 -- common/autotest_common.sh@10 -- # set +x 00:05:43.665 ************************************ 00:05:43.665 END TEST locking_app_on_unlocked_coremask 00:05:43.665 ************************************ 
00:05:43.665 06:15:13 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:05:43.665 06:15:13 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:43.665 06:15:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:43.665 06:15:13 -- common/autotest_common.sh@10 -- # set +x 00:05:43.665 ************************************ 00:05:43.665 START TEST locking_app_on_locked_coremask 00:05:43.665 ************************************ 00:05:43.665 06:15:13 -- common/autotest_common.sh@1114 -- # locking_app_on_locked_coremask 00:05:43.665 06:15:13 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=14976 00:05:43.665 06:15:13 -- event/cpu_locks.sh@116 -- # waitforlisten 14976 /var/tmp/spdk.sock 00:05:43.665 06:15:13 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:43.665 06:15:13 -- common/autotest_common.sh@829 -- # '[' -z 14976 ']' 00:05:43.665 06:15:13 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:43.665 06:15:13 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:43.665 06:15:13 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:43.665 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:43.665 06:15:13 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:43.665 06:15:13 -- common/autotest_common.sh@10 -- # set +x 00:05:43.665 [2024-11-27 06:15:13.072866] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:43.665 [2024-11-27 06:15:13.072935] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid14976 ] 00:05:43.665 EAL: No free 2048 kB hugepages reported on node 1 00:05:43.665 [2024-11-27 06:15:13.140790] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:43.925 [2024-11-27 06:15:13.207114] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:43.925 [2024-11-27 06:15:13.207237] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.495 06:15:13 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:44.495 06:15:13 -- common/autotest_common.sh@862 -- # return 0 00:05:44.495 06:15:13 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:44.495 06:15:13 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=15248 00:05:44.495 06:15:13 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 15248 /var/tmp/spdk2.sock 00:05:44.495 06:15:13 -- common/autotest_common.sh@650 -- # local es=0 00:05:44.495 06:15:13 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 15248 /var/tmp/spdk2.sock 00:05:44.495 06:15:13 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:05:44.495 06:15:13 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:44.495 06:15:13 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:05:44.495 06:15:13 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:44.495 06:15:13 -- common/autotest_common.sh@653 -- # waitforlisten 15248 /var/tmp/spdk2.sock 00:05:44.495 06:15:13 -- common/autotest_common.sh@829 -- # '[' -z 15248 ']' 00:05:44.495 06:15:13 -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:44.495 06:15:13 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:44.495 06:15:13 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:44.495 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:44.495 06:15:13 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:44.495 06:15:13 -- common/autotest_common.sh@10 -- # set +x 00:05:44.495 [2024-11-27 06:15:13.925550] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:44.496 [2024-11-27 06:15:13.925620] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid15248 ] 00:05:44.496 EAL: No free 2048 kB hugepages reported on node 1 00:05:44.496 [2024-11-27 06:15:14.015139] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 14976 has claimed it. 00:05:44.496 [2024-11-27 06:15:14.015180] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:45.064 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (15248) - No such process 00:05:45.064 ERROR: process (pid: 15248) is no longer running 00:05:45.064 06:15:14 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:45.064 06:15:14 -- common/autotest_common.sh@862 -- # return 1 00:05:45.064 06:15:14 -- common/autotest_common.sh@653 -- # es=1 00:05:45.064 06:15:14 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:45.064 06:15:14 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:45.064 06:15:14 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:45.064 06:15:14 -- event/cpu_locks.sh@122 -- # locks_exist 14976 00:05:45.064 06:15:14 -- event/cpu_locks.sh@22 -- # lslocks -p 14976 00:05:45.064 06:15:14 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:45.633 lslocks: write error 00:05:45.633 06:15:15 -- event/cpu_locks.sh@124 -- # killprocess 14976 00:05:45.633 06:15:15 -- common/autotest_common.sh@936 -- # '[' -z 14976 ']' 00:05:45.633 06:15:15 -- common/autotest_common.sh@940 -- # kill -0 14976 00:05:45.633 06:15:15 -- common/autotest_common.sh@941 -- # uname 00:05:45.633 06:15:15 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:45.633 06:15:15 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 14976 00:05:45.633 06:15:15 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:45.633 06:15:15 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:45.633 06:15:15 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 14976' 00:05:45.633 killing process with pid 14976 00:05:45.633 06:15:15 -- common/autotest_common.sh@955 -- # kill 14976 00:05:45.633 06:15:15 -- common/autotest_common.sh@960 -- # wait 14976 00:05:45.894 00:05:45.894 real 0m2.345s 00:05:45.894 user 0m2.603s 00:05:45.894 sys 0m0.697s 00:05:45.894 06:15:15 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:45.894 06:15:15 -- common/autotest_common.sh@10 -- # set +x 00:05:45.894 ************************************ 00:05:45.894 END TEST locking_app_on_locked_coremask 00:05:45.894 ************************************ 00:05:46.153 06:15:15 -- event/cpu_locks.sh@171 -- # run_test 
locking_overlapped_coremask locking_overlapped_coremask 00:05:46.153 06:15:15 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:46.153 06:15:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:46.153 06:15:15 -- common/autotest_common.sh@10 -- # set +x 00:05:46.153 ************************************ 00:05:46.153 START TEST locking_overlapped_coremask 00:05:46.153 ************************************ 00:05:46.153 06:15:15 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask 00:05:46.153 06:15:15 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=15548 00:05:46.153 06:15:15 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:05:46.153 06:15:15 -- event/cpu_locks.sh@133 -- # waitforlisten 15548 /var/tmp/spdk.sock 00:05:46.153 06:15:15 -- common/autotest_common.sh@829 -- # '[' -z 15548 ']' 00:05:46.153 06:15:15 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:46.153 06:15:15 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:46.153 06:15:15 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:46.153 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:46.153 06:15:15 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:46.153 06:15:15 -- common/autotest_common.sh@10 -- # set +x 00:05:46.153 [2024-11-27 06:15:15.447540] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:46.153 [2024-11-27 06:15:15.447592] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid15548 ] 00:05:46.153 EAL: No free 2048 kB hugepages reported on node 1 00:05:46.153 [2024-11-27 06:15:15.512432] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:46.153 [2024-11-27 06:15:15.589191] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:46.153 [2024-11-27 06:15:15.589329] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:46.153 [2024-11-27 06:15:15.589425] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:46.153 [2024-11-27 06:15:15.589427] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.091 06:15:16 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:47.091 06:15:16 -- common/autotest_common.sh@862 -- # return 0 00:05:47.091 06:15:16 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=15570 00:05:47.091 06:15:16 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 15570 /var/tmp/spdk2.sock 00:05:47.091 06:15:16 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:05:47.091 06:15:16 -- common/autotest_common.sh@650 -- # local es=0 00:05:47.091 06:15:16 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 15570 /var/tmp/spdk2.sock 00:05:47.091 06:15:16 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:05:47.091 06:15:16 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:47.091 06:15:16 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:05:47.091 06:15:16 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:47.091 06:15:16 -- common/autotest_common.sh@653 -- # waitforlisten 15570 
/var/tmp/spdk2.sock 00:05:47.091 06:15:16 -- common/autotest_common.sh@829 -- # '[' -z 15570 ']' 00:05:47.091 06:15:16 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:47.091 06:15:16 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:47.091 06:15:16 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:47.091 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:47.091 06:15:16 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:47.091 06:15:16 -- common/autotest_common.sh@10 -- # set +x 00:05:47.091 [2024-11-27 06:15:16.319170] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:47.091 [2024-11-27 06:15:16.319257] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid15570 ] 00:05:47.091 EAL: No free 2048 kB hugepages reported on node 1 00:05:47.091 [2024-11-27 06:15:16.412803] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 15548 has claimed it. 00:05:47.091 [2024-11-27 06:15:16.412842] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:47.661 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (15570) - No such process 00:05:47.661 ERROR: process (pid: 15570) is no longer running 00:05:47.661 06:15:16 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:47.661 06:15:16 -- common/autotest_common.sh@862 -- # return 1 00:05:47.661 06:15:16 -- common/autotest_common.sh@653 -- # es=1 00:05:47.661 06:15:16 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:47.661 06:15:16 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:47.661 06:15:16 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:47.661 06:15:16 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:05:47.661 06:15:16 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:47.661 06:15:16 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:47.661 06:15:16 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:47.661 06:15:16 -- event/cpu_locks.sh@141 -- # killprocess 15548 00:05:47.661 06:15:16 -- common/autotest_common.sh@936 -- # '[' -z 15548 ']' 00:05:47.661 06:15:16 -- common/autotest_common.sh@940 -- # kill -0 15548 00:05:47.661 06:15:16 -- common/autotest_common.sh@941 -- # uname 00:05:47.661 06:15:16 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:47.661 06:15:16 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 15548 00:05:47.661 06:15:17 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:47.661 06:15:17 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:47.661 06:15:17 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 15548' 00:05:47.661 killing process with pid 15548 00:05:47.661 06:15:17 -- common/autotest_common.sh@955 -- # kill 15548 00:05:47.661 06:15:17 -- common/autotest_common.sh@960 -- # wait 15548 00:05:47.921 00:05:47.921 real 
0m1.918s 00:05:47.921 user 0m5.482s 00:05:47.921 sys 0m0.458s 00:05:47.921 06:15:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:47.921 06:15:17 -- common/autotest_common.sh@10 -- # set +x 00:05:47.921 ************************************ 00:05:47.921 END TEST locking_overlapped_coremask 00:05:47.921 ************************************ 00:05:47.921 06:15:17 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:05:47.921 06:15:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:47.921 06:15:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:47.921 06:15:17 -- common/autotest_common.sh@10 -- # set +x 00:05:47.921 ************************************ 00:05:47.921 START TEST locking_overlapped_coremask_via_rpc 00:05:47.921 ************************************ 00:05:47.921 06:15:17 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask_via_rpc 00:05:47.921 06:15:17 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=15860 00:05:47.921 06:15:17 -- event/cpu_locks.sh@149 -- # waitforlisten 15860 /var/tmp/spdk.sock 00:05:47.921 06:15:17 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:05:47.921 06:15:17 -- common/autotest_common.sh@829 -- # '[' -z 15860 ']' 00:05:47.921 06:15:17 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:47.921 06:15:17 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:47.921 06:15:17 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:47.921 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:47.921 06:15:17 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:47.921 06:15:17 -- common/autotest_common.sh@10 -- # set +x 00:05:47.921 [2024-11-27 06:15:17.433288] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:47.921 [2024-11-27 06:15:17.433379] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid15860 ] 00:05:48.180 EAL: No free 2048 kB hugepages reported on node 1 00:05:48.180 [2024-11-27 06:15:17.506420] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:05:48.180 [2024-11-27 06:15:17.506448] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:48.180 [2024-11-27 06:15:17.578672] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:48.180 [2024-11-27 06:15:17.578823] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:48.180 [2024-11-27 06:15:17.578848] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:48.180 [2024-11-27 06:15:17.578853] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:48.748 06:15:18 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:48.748 06:15:18 -- common/autotest_common.sh@862 -- # return 0 00:05:48.748 06:15:18 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=16023 00:05:48.748 06:15:18 -- event/cpu_locks.sh@153 -- # waitforlisten 16023 /var/tmp/spdk2.sock 00:05:48.748 06:15:18 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:05:48.748 06:15:18 -- common/autotest_common.sh@829 -- # '[' -z 16023 ']' 00:05:48.748 06:15:18 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:48.748 06:15:18 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:48.748 06:15:18 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:48.748 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:48.748 06:15:18 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:48.748 06:15:18 -- common/autotest_common.sh@10 -- # set +x 00:05:49.007 [2024-11-27 06:15:18.289144] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:49.007 [2024-11-27 06:15:18.289220] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid16023 ] 00:05:49.007 EAL: No free 2048 kB hugepages reported on node 1 00:05:49.007 [2024-11-27 06:15:18.385470] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:05:49.007 [2024-11-27 06:15:18.385503] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:49.007 [2024-11-27 06:15:18.524402] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:49.007 [2024-11-27 06:15:18.524570] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:49.007 [2024-11-27 06:15:18.527647] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:49.007 [2024-11-27 06:15:18.527649] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:05:49.948 06:15:19 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:49.948 06:15:19 -- common/autotest_common.sh@862 -- # return 0 00:05:49.948 06:15:19 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:05:49.949 06:15:19 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:49.949 06:15:19 -- common/autotest_common.sh@10 -- # set +x 00:05:49.949 06:15:19 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:49.949 06:15:19 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:49.949 06:15:19 -- common/autotest_common.sh@650 -- # local es=0 00:05:49.949 06:15:19 -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:49.949 06:15:19 -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:05:49.949 06:15:19 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:49.949 06:15:19 -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:05:49.949 06:15:19 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:49.949 06:15:19 -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:49.949 06:15:19 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:49.949 06:15:19 -- common/autotest_common.sh@10 -- # set +x 00:05:49.949 [2024-11-27 06:15:19.143657] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 15860 has claimed it. 00:05:49.949 request: 00:05:49.949 { 00:05:49.949 "method": "framework_enable_cpumask_locks", 00:05:49.949 "req_id": 1 00:05:49.949 } 00:05:49.949 Got JSON-RPC error response 00:05:49.949 response: 00:05:49.949 { 00:05:49.949 "code": -32603, 00:05:49.949 "message": "Failed to claim CPU core: 2" 00:05:49.949 } 00:05:49.949 06:15:19 -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:05:49.949 06:15:19 -- common/autotest_common.sh@653 -- # es=1 00:05:49.949 06:15:19 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:49.949 06:15:19 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:49.949 06:15:19 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:49.949 06:15:19 -- event/cpu_locks.sh@158 -- # waitforlisten 15860 /var/tmp/spdk.sock 00:05:49.949 06:15:19 -- common/autotest_common.sh@829 -- # '[' -z 15860 ']' 00:05:49.949 06:15:19 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:49.949 06:15:19 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:49.949 06:15:19 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:49.949 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
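The JSON-RPC exchange above is the whole locking_overlapped_coremask_via_rpc mechanism: with --disable-cpumask-locks neither target claims cores at startup ("CPU core locks deactivated"), the first framework_enable_cpumask_locks call claims the mask, and the second fails with code -32603 once core 2 is already held. A hedged sketch of driving the same sequence by hand with scripts/rpc.py, reusing the sockets and masks from this run (paths abbreviated to the repo root):

  # first target defers its claim, then takes cores 0-2 over RPC
  ./build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks &
  ./scripts/rpc.py -s /var/tmp/spdk.sock framework_enable_cpumask_locks
  # second target overlaps on core 2; its claim is expected to fail with
  # "Failed to claim CPU core: 2" (code -32603), as logged above
  ./build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks &
  ./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks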
00:05:49.949 06:15:19 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:49.949 06:15:19 -- common/autotest_common.sh@10 -- # set +x 00:05:49.949 06:15:19 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:49.949 06:15:19 -- common/autotest_common.sh@862 -- # return 0 00:05:49.949 06:15:19 -- event/cpu_locks.sh@159 -- # waitforlisten 16023 /var/tmp/spdk2.sock 00:05:49.949 06:15:19 -- common/autotest_common.sh@829 -- # '[' -z 16023 ']' 00:05:49.949 06:15:19 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:49.949 06:15:19 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:49.949 06:15:19 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:49.949 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:49.949 06:15:19 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:49.949 06:15:19 -- common/autotest_common.sh@10 -- # set +x 00:05:50.209 06:15:19 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:50.209 06:15:19 -- common/autotest_common.sh@862 -- # return 0 00:05:50.209 06:15:19 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:05:50.209 06:15:19 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:50.209 06:15:19 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:50.209 06:15:19 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:50.209 00:05:50.209 real 0m2.120s 00:05:50.209 user 0m0.860s 00:05:50.209 sys 0m0.199s 00:05:50.209 06:15:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:50.209 06:15:19 -- common/autotest_common.sh@10 -- # set +x 00:05:50.209 ************************************ 00:05:50.209 END TEST locking_overlapped_coremask_via_rpc 00:05:50.209 ************************************ 00:05:50.209 06:15:19 -- event/cpu_locks.sh@174 -- # cleanup 00:05:50.209 06:15:19 -- event/cpu_locks.sh@15 -- # [[ -z 15860 ]] 00:05:50.209 06:15:19 -- event/cpu_locks.sh@15 -- # killprocess 15860 00:05:50.209 06:15:19 -- common/autotest_common.sh@936 -- # '[' -z 15860 ']' 00:05:50.209 06:15:19 -- common/autotest_common.sh@940 -- # kill -0 15860 00:05:50.209 06:15:19 -- common/autotest_common.sh@941 -- # uname 00:05:50.209 06:15:19 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:50.209 06:15:19 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 15860 00:05:50.209 06:15:19 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:50.209 06:15:19 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:50.209 06:15:19 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 15860' 00:05:50.209 killing process with pid 15860 00:05:50.209 06:15:19 -- common/autotest_common.sh@955 -- # kill 15860 00:05:50.209 06:15:19 -- common/autotest_common.sh@960 -- # wait 15860 00:05:50.469 06:15:19 -- event/cpu_locks.sh@16 -- # [[ -z 16023 ]] 00:05:50.469 06:15:19 -- event/cpu_locks.sh@16 -- # killprocess 16023 00:05:50.469 06:15:19 -- common/autotest_common.sh@936 -- # '[' -z 16023 ']' 00:05:50.469 06:15:19 -- common/autotest_common.sh@940 -- # kill -0 16023 00:05:50.469 06:15:19 -- common/autotest_common.sh@941 -- # uname 00:05:50.469 06:15:19 -- 
common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:50.469 06:15:19 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 16023 00:05:50.728 06:15:20 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:05:50.728 06:15:20 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:05:50.728 06:15:20 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 16023' 00:05:50.728 killing process with pid 16023 00:05:50.728 06:15:20 -- common/autotest_common.sh@955 -- # kill 16023 00:05:50.728 06:15:20 -- common/autotest_common.sh@960 -- # wait 16023 00:05:50.988 06:15:20 -- event/cpu_locks.sh@18 -- # rm -f 00:05:50.988 06:15:20 -- event/cpu_locks.sh@1 -- # cleanup 00:05:50.988 06:15:20 -- event/cpu_locks.sh@15 -- # [[ -z 15860 ]] 00:05:50.988 06:15:20 -- event/cpu_locks.sh@15 -- # killprocess 15860 00:05:50.988 06:15:20 -- common/autotest_common.sh@936 -- # '[' -z 15860 ']' 00:05:50.988 06:15:20 -- common/autotest_common.sh@940 -- # kill -0 15860 00:05:50.988 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (15860) - No such process 00:05:50.988 06:15:20 -- common/autotest_common.sh@963 -- # echo 'Process with pid 15860 is not found' 00:05:50.988 Process with pid 15860 is not found 00:05:50.988 06:15:20 -- event/cpu_locks.sh@16 -- # [[ -z 16023 ]] 00:05:50.988 06:15:20 -- event/cpu_locks.sh@16 -- # killprocess 16023 00:05:50.988 06:15:20 -- common/autotest_common.sh@936 -- # '[' -z 16023 ']' 00:05:50.988 06:15:20 -- common/autotest_common.sh@940 -- # kill -0 16023 00:05:50.988 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (16023) - No such process 00:05:50.988 06:15:20 -- common/autotest_common.sh@963 -- # echo 'Process with pid 16023 is not found' 00:05:50.988 Process with pid 16023 is not found 00:05:50.988 06:15:20 -- event/cpu_locks.sh@18 -- # rm -f 00:05:50.988 00:05:50.988 real 0m18.330s 00:05:50.988 user 0m31.132s 00:05:50.988 sys 0m5.840s 00:05:50.988 06:15:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:50.988 06:15:20 -- common/autotest_common.sh@10 -- # set +x 00:05:50.988 ************************************ 00:05:50.988 END TEST cpu_locks 00:05:50.988 ************************************ 00:05:50.988 00:05:50.988 real 0m44.040s 00:05:50.988 user 1m23.320s 00:05:50.988 sys 0m9.956s 00:05:50.988 06:15:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:50.988 06:15:20 -- common/autotest_common.sh@10 -- # set +x 00:05:50.988 ************************************ 00:05:50.988 END TEST event 00:05:50.988 ************************************ 00:05:50.988 06:15:20 -- spdk/autotest.sh@175 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:05:50.988 06:15:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:50.988 06:15:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:50.988 06:15:20 -- common/autotest_common.sh@10 -- # set +x 00:05:50.988 ************************************ 00:05:50.988 START TEST thread 00:05:50.988 ************************************ 00:05:50.988 06:15:20 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:05:50.988 * Looking for test storage... 
00:05:50.988 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:05:50.988 06:15:20 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:50.988 06:15:20 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:50.988 06:15:20 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:51.249 06:15:20 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:51.249 06:15:20 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:51.249 06:15:20 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:51.249 06:15:20 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:51.249 06:15:20 -- scripts/common.sh@335 -- # IFS=.-: 00:05:51.249 06:15:20 -- scripts/common.sh@335 -- # read -ra ver1 00:05:51.249 06:15:20 -- scripts/common.sh@336 -- # IFS=.-: 00:05:51.249 06:15:20 -- scripts/common.sh@336 -- # read -ra ver2 00:05:51.249 06:15:20 -- scripts/common.sh@337 -- # local 'op=<' 00:05:51.249 06:15:20 -- scripts/common.sh@339 -- # ver1_l=2 00:05:51.249 06:15:20 -- scripts/common.sh@340 -- # ver2_l=1 00:05:51.249 06:15:20 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:51.249 06:15:20 -- scripts/common.sh@343 -- # case "$op" in 00:05:51.249 06:15:20 -- scripts/common.sh@344 -- # : 1 00:05:51.249 06:15:20 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:51.249 06:15:20 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:51.249 06:15:20 -- scripts/common.sh@364 -- # decimal 1 00:05:51.249 06:15:20 -- scripts/common.sh@352 -- # local d=1 00:05:51.249 06:15:20 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:51.249 06:15:20 -- scripts/common.sh@354 -- # echo 1 00:05:51.249 06:15:20 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:51.249 06:15:20 -- scripts/common.sh@365 -- # decimal 2 00:05:51.249 06:15:20 -- scripts/common.sh@352 -- # local d=2 00:05:51.249 06:15:20 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:51.249 06:15:20 -- scripts/common.sh@354 -- # echo 2 00:05:51.249 06:15:20 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:51.249 06:15:20 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:51.249 06:15:20 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:51.249 06:15:20 -- scripts/common.sh@367 -- # return 0 00:05:51.249 06:15:20 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:51.249 06:15:20 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:51.249 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:51.249 --rc genhtml_branch_coverage=1 00:05:51.249 --rc genhtml_function_coverage=1 00:05:51.249 --rc genhtml_legend=1 00:05:51.249 --rc geninfo_all_blocks=1 00:05:51.249 --rc geninfo_unexecuted_blocks=1 00:05:51.249 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:51.249 ' 00:05:51.249 06:15:20 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:51.249 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:51.249 --rc genhtml_branch_coverage=1 00:05:51.249 --rc genhtml_function_coverage=1 00:05:51.249 --rc genhtml_legend=1 00:05:51.249 --rc geninfo_all_blocks=1 00:05:51.249 --rc geninfo_unexecuted_blocks=1 00:05:51.249 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:51.249 ' 00:05:51.249 06:15:20 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:51.249 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:51.249 --rc genhtml_branch_coverage=1 
00:05:51.249 --rc genhtml_function_coverage=1 00:05:51.249 --rc genhtml_legend=1 00:05:51.249 --rc geninfo_all_blocks=1 00:05:51.249 --rc geninfo_unexecuted_blocks=1 00:05:51.249 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:51.249 ' 00:05:51.249 06:15:20 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:51.249 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:51.249 --rc genhtml_branch_coverage=1 00:05:51.249 --rc genhtml_function_coverage=1 00:05:51.249 --rc genhtml_legend=1 00:05:51.249 --rc geninfo_all_blocks=1 00:05:51.249 --rc geninfo_unexecuted_blocks=1 00:05:51.249 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:51.249 ' 00:05:51.249 06:15:20 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:51.249 06:15:20 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:05:51.249 06:15:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:51.249 06:15:20 -- common/autotest_common.sh@10 -- # set +x 00:05:51.249 ************************************ 00:05:51.249 START TEST thread_poller_perf 00:05:51.249 ************************************ 00:05:51.249 06:15:20 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:51.249 [2024-11-27 06:15:20.626535] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:51.249 [2024-11-27 06:15:20.626631] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid16508 ] 00:05:51.249 EAL: No free 2048 kB hugepages reported on node 1 00:05:51.249 [2024-11-27 06:15:20.698126] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:51.249 [2024-11-27 06:15:20.767660] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:51.249 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:05:52.628 [2024-11-27T05:15:22.164Z] ====================================== 00:05:52.628 [2024-11-27T05:15:22.164Z] busy:2503757456 (cyc) 00:05:52.628 [2024-11-27T05:15:22.164Z] total_run_count: 793000 00:05:52.628 [2024-11-27T05:15:22.164Z] tsc_hz: 2500000000 (cyc) 00:05:52.629 [2024-11-27T05:15:22.165Z] ====================================== 00:05:52.629 [2024-11-27T05:15:22.165Z] poller_cost: 3157 (cyc), 1262 (nsec) 00:05:52.629 00:05:52.629 real 0m1.228s 00:05:52.629 user 0m1.139s 00:05:52.629 sys 0m0.084s 00:05:52.629 06:15:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:52.629 06:15:21 -- common/autotest_common.sh@10 -- # set +x 00:05:52.629 ************************************ 00:05:52.629 END TEST thread_poller_perf 00:05:52.629 ************************************ 00:05:52.629 06:15:21 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:52.629 06:15:21 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:05:52.629 06:15:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:52.629 06:15:21 -- common/autotest_common.sh@10 -- # set +x 00:05:52.629 ************************************ 00:05:52.629 START TEST thread_poller_perf 00:05:52.629 ************************************ 00:05:52.629 06:15:21 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:52.629 [2024-11-27 06:15:21.903139] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:52.629 [2024-11-27 06:15:21.903231] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid16794 ] 00:05:52.629 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.629 [2024-11-27 06:15:21.973047] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:52.629 [2024-11-27 06:15:22.040023] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.629 Running 1000 pollers for 1 seconds with 0 microseconds period. 
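The poller summary above is plain arithmetic: poller_cost is busy TSC cycles divided by total_run_count, then converted to nanoseconds with tsc_hz. Reproducing the first table's numbers in shell, with the values copied from above (the 0-microsecond run that follows prints 188 cyc / 75 nsec the same way):

  busy=2503757456 runs=793000 tsc_hz=2500000000
  cyc=$(( busy / runs ))                 # 3157 cycles per poller call
  nsec=$(( cyc * 1000000000 / tsc_hz ))  # 1262 ns at a 2.5 GHz TSC
  echo "poller_cost: $cyc (cyc), $nsec (nsec)"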
00:05:54.008 [2024-11-27T05:15:23.544Z] ====================================== 00:05:54.008 [2024-11-27T05:15:23.544Z] busy:2502112680 (cyc) 00:05:54.008 [2024-11-27T05:15:23.544Z] total_run_count: 13309000 00:05:54.008 [2024-11-27T05:15:23.544Z] tsc_hz: 2500000000 (cyc) 00:05:54.008 [2024-11-27T05:15:23.544Z] ====================================== 00:05:54.008 [2024-11-27T05:15:23.544Z] poller_cost: 188 (cyc), 75 (nsec) 00:05:54.008 00:05:54.008 real 0m1.221s 00:05:54.008 user 0m1.129s 00:05:54.008 sys 0m0.087s 00:05:54.008 06:15:23 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:54.008 06:15:23 -- common/autotest_common.sh@10 -- # set +x 00:05:54.008 ************************************ 00:05:54.008 END TEST thread_poller_perf 00:05:54.008 ************************************ 00:05:54.008 06:15:23 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:05:54.008 06:15:23 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:05:54.008 06:15:23 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:54.008 06:15:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:54.008 06:15:23 -- common/autotest_common.sh@10 -- # set +x 00:05:54.008 ************************************ 00:05:54.008 START TEST thread_spdk_lock 00:05:54.008 ************************************ 00:05:54.008 06:15:23 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:05:54.008 [2024-11-27 06:15:23.176611] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:54.008 [2024-11-27 06:15:23.176704] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid17079 ] 00:05:54.008 EAL: No free 2048 kB hugepages reported on node 1 00:05:54.008 [2024-11-27 06:15:23.247808] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:54.008 [2024-11-27 06:15:23.316308] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:54.008 [2024-11-27 06:15:23.316310] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:54.268 [2024-11-27 06:15:23.800285] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 957:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:54.268 [2024-11-27 06:15:23.800321] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3064:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:05:54.268 [2024-11-27 06:15:23.800331] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3019:sspin_stacks_print: *ERROR*: spinlock 0x1483c80 00:05:54.268 [2024-11-27 06:15:23.801248] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:54.268 [2024-11-27 06:15:23.801353] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1018:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:54.268 [2024-11-27 06:15:23.801372] 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:54.548 Starting test contend 00:05:54.548 Worker Delay Wait us Hold us Total us 00:05:54.548 0 3 168166 183160 351326 00:05:54.548 1 5 84395 283898 368293 00:05:54.548 PASS test contend 00:05:54.548 Starting test hold_by_poller 00:05:54.548 PASS test hold_by_poller 00:05:54.548 Starting test hold_by_message 00:05:54.548 PASS test hold_by_message 00:05:54.548 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:05:54.548 100014 assertions passed 00:05:54.548 0 assertions failed 00:05:54.548 00:05:54.548 real 0m0.703s 00:05:54.548 user 0m1.101s 00:05:54.548 sys 0m0.085s 00:05:54.548 06:15:23 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:54.548 06:15:23 -- common/autotest_common.sh@10 -- # set +x 00:05:54.548 ************************************ 00:05:54.548 END TEST thread_spdk_lock 00:05:54.548 ************************************ 00:05:54.548 00:05:54.548 real 0m3.493s 00:05:54.548 user 0m3.512s 00:05:54.548 sys 0m0.498s 00:05:54.548 06:15:23 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:54.548 06:15:23 -- common/autotest_common.sh@10 -- # set +x 00:05:54.548 ************************************ 00:05:54.548 END TEST thread 00:05:54.548 ************************************ 00:05:54.548 06:15:23 -- spdk/autotest.sh@176 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:05:54.548 06:15:23 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:54.548 06:15:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:54.548 06:15:23 -- common/autotest_common.sh@10 -- # set +x 00:05:54.548 ************************************ 00:05:54.548 START TEST accel 00:05:54.548 ************************************ 00:05:54.548 06:15:23 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:05:54.548 * Looking for test storage... 00:05:54.548 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:05:54.548 06:15:24 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:54.548 06:15:24 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:54.548 06:15:24 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:54.890 06:15:24 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:54.890 06:15:24 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:54.890 06:15:24 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:54.890 06:15:24 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:54.890 06:15:24 -- scripts/common.sh@335 -- # IFS=.-: 00:05:54.890 06:15:24 -- scripts/common.sh@335 -- # read -ra ver1 00:05:54.890 06:15:24 -- scripts/common.sh@336 -- # IFS=.-: 00:05:54.890 06:15:24 -- scripts/common.sh@336 -- # read -ra ver2 00:05:54.890 06:15:24 -- scripts/common.sh@337 -- # local 'op=<' 00:05:54.890 06:15:24 -- scripts/common.sh@339 -- # ver1_l=2 00:05:54.890 06:15:24 -- scripts/common.sh@340 -- # ver2_l=1 00:05:54.890 06:15:24 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:54.890 06:15:24 -- scripts/common.sh@343 -- # case "$op" in 00:05:54.890 06:15:24 -- scripts/common.sh@344 -- # : 1 00:05:54.890 06:15:24 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:54.890 06:15:24 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:54.890 06:15:24 -- scripts/common.sh@364 -- # decimal 1 00:05:54.890 06:15:24 -- scripts/common.sh@352 -- # local d=1 00:05:54.890 06:15:24 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:54.890 06:15:24 -- scripts/common.sh@354 -- # echo 1 00:05:54.890 06:15:24 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:54.890 06:15:24 -- scripts/common.sh@365 -- # decimal 2 00:05:54.890 06:15:24 -- scripts/common.sh@352 -- # local d=2 00:05:54.890 06:15:24 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:54.890 06:15:24 -- scripts/common.sh@354 -- # echo 2 00:05:54.890 06:15:24 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:54.890 06:15:24 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:54.890 06:15:24 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:54.890 06:15:24 -- scripts/common.sh@367 -- # return 0 00:05:54.890 06:15:24 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:54.890 06:15:24 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:54.890 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:54.890 --rc genhtml_branch_coverage=1 00:05:54.890 --rc genhtml_function_coverage=1 00:05:54.890 --rc genhtml_legend=1 00:05:54.890 --rc geninfo_all_blocks=1 00:05:54.890 --rc geninfo_unexecuted_blocks=1 00:05:54.890 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:54.890 ' 00:05:54.890 06:15:24 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:54.890 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:54.890 --rc genhtml_branch_coverage=1 00:05:54.890 --rc genhtml_function_coverage=1 00:05:54.890 --rc genhtml_legend=1 00:05:54.890 --rc geninfo_all_blocks=1 00:05:54.890 --rc geninfo_unexecuted_blocks=1 00:05:54.890 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:54.890 ' 00:05:54.890 06:15:24 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:54.890 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:54.890 --rc genhtml_branch_coverage=1 00:05:54.890 --rc genhtml_function_coverage=1 00:05:54.890 --rc genhtml_legend=1 00:05:54.890 --rc geninfo_all_blocks=1 00:05:54.890 --rc geninfo_unexecuted_blocks=1 00:05:54.890 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:54.890 ' 00:05:54.890 06:15:24 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:54.890 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:54.890 --rc genhtml_branch_coverage=1 00:05:54.890 --rc genhtml_function_coverage=1 00:05:54.890 --rc genhtml_legend=1 00:05:54.890 --rc geninfo_all_blocks=1 00:05:54.890 --rc geninfo_unexecuted_blocks=1 00:05:54.890 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:54.890 ' 00:05:54.890 06:15:24 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:05:54.890 06:15:24 -- accel/accel.sh@74 -- # get_expected_opcs 00:05:54.890 06:15:24 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:54.890 06:15:24 -- accel/accel.sh@59 -- # spdk_tgt_pid=17188 00:05:54.890 06:15:24 -- accel/accel.sh@60 -- # waitforlisten 17188 00:05:54.890 06:15:24 -- common/autotest_common.sh@829 -- # '[' -z 17188 ']' 00:05:54.890 06:15:24 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:54.890 06:15:24 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:54.890 06:15:24 -- 
common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:54.890 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:54.890 06:15:24 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:54.890 06:15:24 -- common/autotest_common.sh@10 -- # set +x 00:05:54.890 06:15:24 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:05:54.891 06:15:24 -- accel/accel.sh@58 -- # build_accel_config 00:05:54.891 06:15:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:54.891 06:15:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:54.891 06:15:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:54.891 06:15:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:54.891 06:15:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:54.891 06:15:24 -- accel/accel.sh@41 -- # local IFS=, 00:05:54.891 06:15:24 -- accel/accel.sh@42 -- # jq -r . 00:05:54.891 [2024-11-27 06:15:24.152422] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:54.891 [2024-11-27 06:15:24.152508] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid17188 ] 00:05:54.891 EAL: No free 2048 kB hugepages reported on node 1 00:05:54.891 [2024-11-27 06:15:24.220854] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:54.891 [2024-11-27 06:15:24.295157] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:54.891 [2024-11-27 06:15:24.295268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.502 06:15:24 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:55.502 06:15:24 -- common/autotest_common.sh@862 -- # return 0 00:05:55.502 06:15:24 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:05:55.502 06:15:24 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:05:55.502 06:15:24 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:05:55.502 06:15:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:55.502 06:15:24 -- common/autotest_common.sh@10 -- # set +x 00:05:55.502 06:15:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 
06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # IFS== 00:05:55.502 06:15:25 -- accel/accel.sh@64 -- # read -r opc module 00:05:55.502 06:15:25 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:55.502 06:15:25 -- accel/accel.sh@67 -- # killprocess 17188 00:05:55.502 06:15:25 -- common/autotest_common.sh@936 -- # '[' -z 17188 ']' 00:05:55.502 06:15:25 -- common/autotest_common.sh@940 -- # kill -0 17188 00:05:55.502 06:15:25 -- common/autotest_common.sh@941 -- # uname 00:05:55.502 06:15:25 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:55.502 06:15:25 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 17188 00:05:55.762 06:15:25 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:55.762 06:15:25 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:55.762 06:15:25 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 17188' 00:05:55.762 killing process with pid 17188 00:05:55.762 06:15:25 -- common/autotest_common.sh@955 -- # kill 17188 00:05:55.762 06:15:25 -- common/autotest_common.sh@960 -- # wait 17188 00:05:56.022 06:15:25 -- accel/accel.sh@68 -- # trap - ERR 00:05:56.022 06:15:25 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:05:56.022 06:15:25 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:05:56.022 06:15:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:56.022 06:15:25 -- common/autotest_common.sh@10 -- # set +x 00:05:56.022 06:15:25 -- common/autotest_common.sh@1114 -- # accel_perf -h 00:05:56.022 06:15:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:05:56.022 06:15:25 -- accel/accel.sh@12 -- # build_accel_config 00:05:56.022 06:15:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:56.022 06:15:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:56.022 06:15:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:56.022 06:15:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:56.022 06:15:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:56.022 06:15:25 -- accel/accel.sh@41 -- # local IFS=, 00:05:56.022 06:15:25 -- accel/accel.sh@42 -- # jq -r . 
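The long IFS== / read loop above is accel.sh parsing accel_get_opc_assignments: in this build every opcode resolves to the software module. The same table can be read directly over RPC, using the jq filter verbatim from the trace above:

  ./scripts/rpc.py accel_get_opc_assignments \
    | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
  # emits one key=value line per opcode, e.g. copy=software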
00:05:56.022 06:15:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:56.022 06:15:25 -- common/autotest_common.sh@10 -- # set +x 00:05:56.022 06:15:25 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:05:56.022 06:15:25 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:56.022 06:15:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:56.022 06:15:25 -- common/autotest_common.sh@10 -- # set +x 00:05:56.022 ************************************ 00:05:56.022 START TEST accel_missing_filename 00:05:56.022 ************************************ 00:05:56.022 06:15:25 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress 00:05:56.022 06:15:25 -- common/autotest_common.sh@650 -- # local es=0 00:05:56.022 06:15:25 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:05:56.022 06:15:25 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:05:56.022 06:15:25 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:56.022 06:15:25 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:05:56.022 06:15:25 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:56.022 06:15:25 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:05:56.022 06:15:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:05:56.022 06:15:25 -- accel/accel.sh@12 -- # build_accel_config 00:05:56.022 06:15:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:56.022 06:15:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:56.022 06:15:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:56.022 06:15:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:56.022 06:15:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:56.022 06:15:25 -- accel/accel.sh@41 -- # local IFS=, 00:05:56.022 06:15:25 -- accel/accel.sh@42 -- # jq -r . 00:05:56.022 [2024-11-27 06:15:25.485234] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:56.022 [2024-11-27 06:15:25.485327] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid17467 ] 00:05:56.022 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.022 [2024-11-27 06:15:25.556682] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:56.282 [2024-11-27 06:15:25.624923] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:56.282 [2024-11-27 06:15:25.664782] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:56.282 [2024-11-27 06:15:25.725052] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:05:56.282 A filename is required. 
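"A filename is required." is accel_perf rejecting a compress workload started without -l (the uncompressed input file); the es= handling just below is the NOT wrapper normalizing the exit status and asserting the command failed. A minimal reproduction with the same flags this log uses, minus the generated config file:

  # compress requires -l <input>; omitting it must exit nonzero
  if ! ./build/examples/accel_perf -t 1 -w compress; then
    echo "failed as expected: compress needs -l"
  fi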
00:05:56.282 06:15:25 -- common/autotest_common.sh@653 -- # es=234 00:05:56.282 06:15:25 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:56.282 06:15:25 -- common/autotest_common.sh@662 -- # es=106 00:05:56.282 06:15:25 -- common/autotest_common.sh@663 -- # case "$es" in 00:05:56.282 06:15:25 -- common/autotest_common.sh@670 -- # es=1 00:05:56.282 06:15:25 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:56.282 00:05:56.282 real 0m0.331s 00:05:56.282 user 0m0.234s 00:05:56.282 sys 0m0.134s 00:05:56.282 06:15:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:56.282 06:15:25 -- common/autotest_common.sh@10 -- # set +x 00:05:56.282 ************************************ 00:05:56.282 END TEST accel_missing_filename 00:05:56.282 ************************************ 00:05:56.541 06:15:25 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:56.541 06:15:25 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:05:56.541 06:15:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:56.541 06:15:25 -- common/autotest_common.sh@10 -- # set +x 00:05:56.541 ************************************ 00:05:56.541 START TEST accel_compress_verify 00:05:56.541 ************************************ 00:05:56.541 06:15:25 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:56.541 06:15:25 -- common/autotest_common.sh@650 -- # local es=0 00:05:56.541 06:15:25 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:56.541 06:15:25 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:05:56.541 06:15:25 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:56.541 06:15:25 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:05:56.541 06:15:25 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:56.541 06:15:25 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:56.541 06:15:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:56.541 06:15:25 -- accel/accel.sh@12 -- # build_accel_config 00:05:56.541 06:15:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:56.541 06:15:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:56.541 06:15:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:56.541 06:15:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:56.541 06:15:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:56.541 06:15:25 -- accel/accel.sh@41 -- # local IFS=, 00:05:56.541 06:15:25 -- accel/accel.sh@42 -- # jq -r . 00:05:56.541 [2024-11-27 06:15:25.859711] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:56.541 [2024-11-27 06:15:25.859801] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid17609 ] 00:05:56.541 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.541 [2024-11-27 06:15:25.933768] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:56.541 [2024-11-27 06:15:26.002432] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:56.541 [2024-11-27 06:15:26.042229] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:56.800 [2024-11-27 06:15:26.102468] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:05:56.800 00:05:56.800 Compression does not support the verify option, aborting. 00:05:56.800 06:15:26 -- common/autotest_common.sh@653 -- # es=161 00:05:56.800 06:15:26 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:56.801 06:15:26 -- common/autotest_common.sh@662 -- # es=33 00:05:56.801 06:15:26 -- common/autotest_common.sh@663 -- # case "$es" in 00:05:56.801 06:15:26 -- common/autotest_common.sh@670 -- # es=1 00:05:56.801 06:15:26 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:56.801 00:05:56.801 real 0m0.332s 00:05:56.801 user 0m0.236s 00:05:56.801 sys 0m0.135s 00:05:56.801 06:15:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:56.801 06:15:26 -- common/autotest_common.sh@10 -- # set +x 00:05:56.801 ************************************ 00:05:56.801 END TEST accel_compress_verify 00:05:56.801 ************************************ 00:05:56.801 06:15:26 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:05:56.801 06:15:26 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:56.801 06:15:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:56.801 06:15:26 -- common/autotest_common.sh@10 -- # set +x 00:05:56.801 ************************************ 00:05:56.801 START TEST accel_wrong_workload 00:05:56.801 ************************************ 00:05:56.801 06:15:26 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w foobar 00:05:56.801 06:15:26 -- common/autotest_common.sh@650 -- # local es=0 00:05:56.801 06:15:26 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:05:56.801 06:15:26 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:05:56.801 06:15:26 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:56.801 06:15:26 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:05:56.801 06:15:26 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:56.801 06:15:26 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:05:56.801 06:15:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:05:56.801 06:15:26 -- accel/accel.sh@12 -- # build_accel_config 00:05:56.801 06:15:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:56.801 06:15:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:56.801 06:15:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:56.801 06:15:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:56.801 06:15:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:56.801 06:15:26 -- accel/accel.sh@41 -- # local IFS=, 00:05:56.801 06:15:26 -- accel/accel.sh@42 -- # jq -r . 
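The build_accel_config stanza above is the same idiom every test here repeats: optional JSON fragments are pushed into the accel_json_cfg array (the [[ 0 -gt 0 ]] guards appear to be expanded SPDK_TEST_* accel switches, all disabled on this run), then IFS=, splices the fragments into a config blob that jq -r validates and pretty-prints. A rough, hypothetical reduction of the idiom (the fragment name is illustrative, not a real SPDK RPC):

build_cfg() {
    local accel_json_cfg=()
    # each enabled engine would contribute one JSON fragment
    [[ ${USE_EXTRA_ENGINE:-0} -gt 0 ]] && accel_json_cfg+=('{"method": "enable_extra_engine"}')
    local IFS=,
    # "${accel_json_cfg[*]}" joins the fragments with the first IFS character
    echo "{\"accel\": {\"config\": [${accel_json_cfg[*]}]}}" | jq -r .
}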
00:05:56.801 Unsupported workload type: foobar 00:05:56.801 [2024-11-27 06:15:26.232969] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:05:56.801 accel_perf options: 00:05:56.801 [-h help message] 00:05:56.801 [-q queue depth per core] 00:05:56.801 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:05:56.801 [-T number of threads per core 00:05:56.801 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:05:56.801 [-t time in seconds] 00:05:56.801 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:05:56.801 [ dif_verify, , dif_generate, dif_generate_copy 00:05:56.801 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:05:56.801 [-l for compress/decompress workloads, name of uncompressed input file 00:05:56.801 [-S for crc32c workload, use this seed value (default 0) 00:05:56.801 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:05:56.801 [-f for fill workload, use this BYTE value (default 255) 00:05:56.801 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:05:56.801 [-y verify result if this switch is on] 00:05:56.801 [-a tasks to allocate per core (default: same value as -q)] 00:05:56.801 Can be used to spread operations across a wider range of memory. 00:05:56.801 06:15:26 -- common/autotest_common.sh@653 -- # es=1 00:05:56.801 06:15:26 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:56.801 06:15:26 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:56.801 06:15:26 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:56.801 00:05:56.801 real 0m0.026s 00:05:56.801 user 0m0.012s 00:05:56.801 sys 0m0.014s 00:05:56.801 06:15:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:56.801 06:15:26 -- common/autotest_common.sh@10 -- # set +x 00:05:56.801 ************************************ 00:05:56.801 END TEST accel_wrong_workload 00:05:56.801 ************************************ 00:05:56.801 Error: writing output failed: Broken pipe 00:05:56.801 06:15:26 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:05:56.801 06:15:26 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:05:56.801 06:15:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:56.801 06:15:26 -- common/autotest_common.sh@10 -- # set +x 00:05:56.801 ************************************ 00:05:56.801 START TEST accel_negative_buffers 00:05:56.801 ************************************ 00:05:56.801 06:15:26 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:05:56.801 06:15:26 -- common/autotest_common.sh@650 -- # local es=0 00:05:56.801 06:15:26 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:05:56.801 06:15:26 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:05:56.801 06:15:26 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:56.801 06:15:26 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:05:56.801 06:15:26 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:56.801 06:15:26 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:05:56.801 06:15:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:05:56.801 06:15:26 -- accel/accel.sh@12 -- # build_accel_config 00:05:56.801 06:15:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:56.801 06:15:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:56.801 06:15:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:56.801 06:15:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:56.801 06:15:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:56.801 06:15:26 -- accel/accel.sh@41 -- # local IFS=, 00:05:56.801 06:15:26 -- accel/accel.sh@42 -- # jq -r . 00:05:56.801 -x option must be non-negative. 00:05:56.801 [2024-11-27 06:15:26.299563] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:05:56.801 accel_perf options: 00:05:56.801 [-h help message] 00:05:56.801 [-q queue depth per core] 00:05:56.801 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:05:56.801 [-T number of threads per core 00:05:56.801 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:05:56.801 [-t time in seconds] 00:05:56.801 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:05:56.801 [ dif_verify, , dif_generate, dif_generate_copy 00:05:56.801 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:05:56.801 [-l for compress/decompress workloads, name of uncompressed input file 00:05:56.801 [-S for crc32c workload, use this seed value (default 0) 00:05:56.801 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:05:56.801 [-f for fill workload, use this BYTE value (default 255) 00:05:56.801 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:05:56.801 [-y verify result if this switch is on] 00:05:56.801 [-a tasks to allocate per core (default: same value as -q)] 00:05:56.801 Can be used to spread operations across a wider range of memory. 
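Both rejections above are pure argument validation: -w foobar was refused before any I/O was attempted, and -x -1 tripped the non-negative guard (xor needs at least two source buffers, per the -x line in the usage text). Judging strictly by that printed option list, a well-formed xor invocation would be:

/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -q 32 -o 4096 -t 1 -w xor -x 2 -y   # -x 2 meets the documented minimum of two source buffers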
00:05:56.801 06:15:26 -- common/autotest_common.sh@653 -- # es=1 00:05:56.801 06:15:26 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:56.801 06:15:26 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:56.801 06:15:26 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:56.801 00:05:56.801 real 0m0.025s 00:05:56.801 user 0m0.011s 00:05:56.801 sys 0m0.014s 00:05:56.801 06:15:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:56.801 06:15:26 -- common/autotest_common.sh@10 -- # set +x 00:05:56.801 ************************************ 00:05:56.801 END TEST accel_negative_buffers 00:05:56.801 ************************************ 00:05:56.801 Error: writing output failed: Broken pipe 00:05:57.061 06:15:26 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:05:57.061 06:15:26 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:05:57.061 06:15:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:57.061 06:15:26 -- common/autotest_common.sh@10 -- # set +x 00:05:57.061 ************************************ 00:05:57.061 START TEST accel_crc32c 00:05:57.061 ************************************ 00:05:57.061 06:15:26 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -S 32 -y 00:05:57.061 06:15:26 -- accel/accel.sh@16 -- # local accel_opc 00:05:57.061 06:15:26 -- accel/accel.sh@17 -- # local accel_module 00:05:57.061 06:15:26 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:05:57.061 06:15:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:05:57.061 06:15:26 -- accel/accel.sh@12 -- # build_accel_config 00:05:57.061 06:15:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:57.061 06:15:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:57.061 06:15:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:57.061 06:15:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:57.061 06:15:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:57.061 06:15:26 -- accel/accel.sh@41 -- # local IFS=, 00:05:57.061 06:15:26 -- accel/accel.sh@42 -- # jq -r . 00:05:57.061 [2024-11-27 06:15:26.364967] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:57.061 [2024-11-27 06:15:26.365036] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid17796 ] 00:05:57.061 EAL: No free 2048 kB hugepages reported on node 1 00:05:57.061 [2024-11-27 06:15:26.433655] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:57.061 [2024-11-27 06:15:26.503691] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.442 06:15:27 -- accel/accel.sh@18 -- # out=' 00:05:58.442 SPDK Configuration: 00:05:58.442 Core mask: 0x1 00:05:58.442 00:05:58.442 Accel Perf Configuration: 00:05:58.442 Workload Type: crc32c 00:05:58.442 CRC-32C seed: 32 00:05:58.442 Transfer size: 4096 bytes 00:05:58.442 Vector count 1 00:05:58.442 Module: software 00:05:58.442 Queue depth: 32 00:05:58.442 Allocate depth: 32 00:05:58.442 # threads/core: 1 00:05:58.442 Run time: 1 seconds 00:05:58.442 Verify: Yes 00:05:58.442 00:05:58.442 Running for 1 seconds... 
00:05:58.442 00:05:58.442 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:58.442 ------------------------------------------------------------------------------------ 00:05:58.442 0,0 852032/s 3328 MiB/s 0 0 00:05:58.442 ==================================================================================== 00:05:58.442 Total 852032/s 3328 MiB/s 0 0' 00:05:58.442 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.442 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.442 06:15:27 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:05:58.442 06:15:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:05:58.442 06:15:27 -- accel/accel.sh@12 -- # build_accel_config 00:05:58.442 06:15:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:58.442 06:15:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:58.442 06:15:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:58.442 06:15:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:58.442 06:15:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:58.442 06:15:27 -- accel/accel.sh@41 -- # local IFS=, 00:05:58.442 06:15:27 -- accel/accel.sh@42 -- # jq -r . 00:05:58.442 [2024-11-27 06:15:27.692864] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:58.442 [2024-11-27 06:15:27.692955] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid18073 ] 00:05:58.442 EAL: No free 2048 kB hugepages reported on node 1 00:05:58.442 [2024-11-27 06:15:27.762625] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:58.442 [2024-11-27 06:15:27.828670] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.442 06:15:27 -- accel/accel.sh@21 -- # val= 00:05:58.442 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.442 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.442 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.442 06:15:27 -- accel/accel.sh@21 -- # val= 00:05:58.442 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.442 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.442 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.442 06:15:27 -- accel/accel.sh@21 -- # val=0x1 00:05:58.442 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.442 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.442 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.442 06:15:27 -- accel/accel.sh@21 -- # val= 00:05:58.442 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.442 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.442 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.442 06:15:27 -- accel/accel.sh@21 -- # val= 00:05:58.442 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.442 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.442 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.443 06:15:27 -- accel/accel.sh@21 -- # val=crc32c 00:05:58.443 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.443 06:15:27 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.443 06:15:27 -- accel/accel.sh@21 -- # val=32 00:05:58.443 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.443 
06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.443 06:15:27 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:58.443 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.443 06:15:27 -- accel/accel.sh@21 -- # val= 00:05:58.443 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.443 06:15:27 -- accel/accel.sh@21 -- # val=software 00:05:58.443 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.443 06:15:27 -- accel/accel.sh@23 -- # accel_module=software 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.443 06:15:27 -- accel/accel.sh@21 -- # val=32 00:05:58.443 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.443 06:15:27 -- accel/accel.sh@21 -- # val=32 00:05:58.443 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.443 06:15:27 -- accel/accel.sh@21 -- # val=1 00:05:58.443 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.443 06:15:27 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:58.443 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.443 06:15:27 -- accel/accel.sh@21 -- # val=Yes 00:05:58.443 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.443 06:15:27 -- accel/accel.sh@21 -- # val= 00:05:58.443 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.443 06:15:27 -- accel/accel.sh@21 -- # val= 00:05:58.443 06:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.443 06:15:27 -- accel/accel.sh@20 -- # read -r var val 00:05:59.823 06:15:28 -- accel/accel.sh@21 -- # val= 00:05:59.823 06:15:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.823 06:15:28 -- accel/accel.sh@20 -- # IFS=: 00:05:59.823 06:15:28 -- accel/accel.sh@20 -- # read -r var val 00:05:59.823 06:15:28 -- accel/accel.sh@21 -- # val= 00:05:59.823 06:15:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.823 06:15:28 -- accel/accel.sh@20 -- # IFS=: 00:05:59.823 06:15:28 -- accel/accel.sh@20 -- # read -r var val 00:05:59.823 06:15:28 -- accel/accel.sh@21 -- # val= 00:05:59.823 06:15:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.823 06:15:28 -- accel/accel.sh@20 -- # IFS=: 00:05:59.823 06:15:28 -- accel/accel.sh@20 -- # read -r var val 00:05:59.823 06:15:28 -- accel/accel.sh@21 -- # val= 00:05:59.823 06:15:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.823 06:15:28 -- accel/accel.sh@20 -- # IFS=: 00:05:59.823 06:15:28 -- accel/accel.sh@20 -- # read -r var val 00:05:59.823 06:15:28 -- accel/accel.sh@21 -- # val= 00:05:59.823 06:15:28 -- accel/accel.sh@22 -- # case "$var" in 
00:05:59.823 06:15:28 -- accel/accel.sh@20 -- # IFS=: 00:05:59.823 06:15:28 -- accel/accel.sh@20 -- # read -r var val 00:05:59.823 06:15:28 -- accel/accel.sh@21 -- # val= 00:05:59.823 06:15:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.823 06:15:28 -- accel/accel.sh@20 -- # IFS=: 00:05:59.823 06:15:28 -- accel/accel.sh@20 -- # read -r var val 00:05:59.823 06:15:28 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:59.823 06:15:28 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:05:59.823 06:15:28 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:59.823 00:05:59.823 real 0m2.654s 00:05:59.823 user 0m2.398s 00:05:59.823 sys 0m0.256s 00:05:59.823 06:15:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:59.823 06:15:28 -- common/autotest_common.sh@10 -- # set +x 00:05:59.823 ************************************ 00:05:59.823 END TEST accel_crc32c 00:05:59.823 ************************************ 00:05:59.823 06:15:29 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:05:59.823 06:15:29 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:05:59.823 06:15:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:59.823 06:15:29 -- common/autotest_common.sh@10 -- # set +x 00:05:59.823 ************************************ 00:05:59.823 START TEST accel_crc32c_C2 00:05:59.823 ************************************ 00:05:59.823 06:15:29 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -y -C 2 00:05:59.823 06:15:29 -- accel/accel.sh@16 -- # local accel_opc 00:05:59.823 06:15:29 -- accel/accel.sh@17 -- # local accel_module 00:05:59.823 06:15:29 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:05:59.823 06:15:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:05:59.823 06:15:29 -- accel/accel.sh@12 -- # build_accel_config 00:05:59.823 06:15:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:59.823 06:15:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:59.823 06:15:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:59.823 06:15:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:59.823 06:15:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:59.823 06:15:29 -- accel/accel.sh@41 -- # local IFS=, 00:05:59.823 06:15:29 -- accel/accel.sh@42 -- # jq -r . 00:05:59.823 [2024-11-27 06:15:29.064405] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:59.823 [2024-11-27 06:15:29.064513] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid18305 ] 00:05:59.823 EAL: No free 2048 kB hugepages reported on node 1 00:05:59.823 [2024-11-27 06:15:29.133104] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:59.823 [2024-11-27 06:15:29.200633] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.202 06:15:30 -- accel/accel.sh@18 -- # out=' 00:06:01.202 SPDK Configuration: 00:06:01.202 Core mask: 0x1 00:06:01.202 00:06:01.202 Accel Perf Configuration: 00:06:01.202 Workload Type: crc32c 00:06:01.202 CRC-32C seed: 0 00:06:01.202 Transfer size: 4096 bytes 00:06:01.202 Vector count 2 00:06:01.202 Module: software 00:06:01.202 Queue depth: 32 00:06:01.202 Allocate depth: 32 00:06:01.202 # threads/core: 1 00:06:01.202 Run time: 1 seconds 00:06:01.202 Verify: Yes 00:06:01.202 00:06:01.202 Running for 1 seconds... 00:06:01.202 00:06:01.202 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:01.202 ------------------------------------------------------------------------------------ 00:06:01.202 0,0 614848/s 4803 MiB/s 0 0 00:06:01.202 ==================================================================================== 00:06:01.202 Total 614848/s 2401 MiB/s 0 0' 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:01.202 06:15:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:01.202 06:15:30 -- accel/accel.sh@12 -- # build_accel_config 00:06:01.202 06:15:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:01.202 06:15:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:01.202 06:15:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:01.202 06:15:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:01.202 06:15:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:01.202 06:15:30 -- accel/accel.sh@41 -- # local IFS=, 00:06:01.202 06:15:30 -- accel/accel.sh@42 -- # jq -r . 00:06:01.202 [2024-11-27 06:15:30.392550] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:01.202 [2024-11-27 06:15:30.392647] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid18468 ] 00:06:01.202 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.202 [2024-11-27 06:15:30.464524] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.202 [2024-11-27 06:15:30.532846] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val= 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val= 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val=0x1 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val= 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val= 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val=crc32c 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val=0 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val= 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val=software 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@23 -- # accel_module=software 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val=32 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val=32 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- 
accel/accel.sh@21 -- # val=1 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val=Yes 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val= 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:01.202 06:15:30 -- accel/accel.sh@21 -- # val= 00:06:01.202 06:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # IFS=: 00:06:01.202 06:15:30 -- accel/accel.sh@20 -- # read -r var val 00:06:02.583 06:15:31 -- accel/accel.sh@21 -- # val= 00:06:02.583 06:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.583 06:15:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.583 06:15:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.583 06:15:31 -- accel/accel.sh@21 -- # val= 00:06:02.583 06:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.583 06:15:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.583 06:15:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.583 06:15:31 -- accel/accel.sh@21 -- # val= 00:06:02.583 06:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.583 06:15:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.583 06:15:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.583 06:15:31 -- accel/accel.sh@21 -- # val= 00:06:02.583 06:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.583 06:15:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.583 06:15:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.583 06:15:31 -- accel/accel.sh@21 -- # val= 00:06:02.583 06:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.583 06:15:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.583 06:15:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.583 06:15:31 -- accel/accel.sh@21 -- # val= 00:06:02.583 06:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.583 06:15:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.583 06:15:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.583 06:15:31 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:02.583 06:15:31 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:02.583 06:15:31 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:02.583 00:06:02.583 real 0m2.661s 00:06:02.583 user 0m2.394s 00:06:02.583 sys 0m0.265s 00:06:02.583 06:15:31 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:02.583 06:15:31 -- common/autotest_common.sh@10 -- # set +x 00:06:02.583 ************************************ 00:06:02.583 END TEST accel_crc32c_C2 00:06:02.584 ************************************ 00:06:02.584 06:15:31 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:02.584 06:15:31 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:02.584 06:15:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:02.584 06:15:31 -- common/autotest_common.sh@10 -- # set +x 00:06:02.584 ************************************ 00:06:02.584 START TEST accel_copy 
00:06:02.584 ************************************ 00:06:02.584 06:15:31 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy -y 00:06:02.584 06:15:31 -- accel/accel.sh@16 -- # local accel_opc 00:06:02.584 06:15:31 -- accel/accel.sh@17 -- # local accel_module 00:06:02.584 06:15:31 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:02.584 06:15:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:02.584 06:15:31 -- accel/accel.sh@12 -- # build_accel_config 00:06:02.584 06:15:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:02.584 06:15:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:02.584 06:15:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:02.584 06:15:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:02.584 06:15:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:02.584 06:15:31 -- accel/accel.sh@41 -- # local IFS=, 00:06:02.584 06:15:31 -- accel/accel.sh@42 -- # jq -r . 00:06:02.584 [2024-11-27 06:15:31.768867] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:02.584 [2024-11-27 06:15:31.768959] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid18681 ] 00:06:02.584 EAL: No free 2048 kB hugepages reported on node 1 00:06:02.584 [2024-11-27 06:15:31.839217] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:02.584 [2024-11-27 06:15:31.907008] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.963 06:15:33 -- accel/accel.sh@18 -- # out=' 00:06:03.963 SPDK Configuration: 00:06:03.963 Core mask: 0x1 00:06:03.963 00:06:03.963 Accel Perf Configuration: 00:06:03.963 Workload Type: copy 00:06:03.963 Transfer size: 4096 bytes 00:06:03.963 Vector count 1 00:06:03.963 Module: software 00:06:03.963 Queue depth: 32 00:06:03.963 Allocate depth: 32 00:06:03.963 # threads/core: 1 00:06:03.963 Run time: 1 seconds 00:06:03.963 Verify: Yes 00:06:03.963 00:06:03.963 Running for 1 seconds... 00:06:03.963 00:06:03.963 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:03.963 ------------------------------------------------------------------------------------ 00:06:03.963 0,0 546368/s 2134 MiB/s 0 0 00:06:03.963 ==================================================================================== 00:06:03.964 Total 546368/s 2134 MiB/s 0 0' 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:03.964 06:15:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:03.964 06:15:33 -- accel/accel.sh@12 -- # build_accel_config 00:06:03.964 06:15:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:03.964 06:15:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:03.964 06:15:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:03.964 06:15:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:03.964 06:15:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:03.964 06:15:33 -- accel/accel.sh@41 -- # local IFS=, 00:06:03.964 06:15:33 -- accel/accel.sh@42 -- # jq -r . 00:06:03.964 [2024-11-27 06:15:33.096320] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:03.964 [2024-11-27 06:15:33.096412] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid18931 ] 00:06:03.964 EAL: No free 2048 kB hugepages reported on node 1 00:06:03.964 [2024-11-27 06:15:33.164093] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:03.964 [2024-11-27 06:15:33.230109] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val= 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val= 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val=0x1 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val= 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val= 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val=copy 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val= 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val=software 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@23 -- # accel_module=software 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val=32 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val=32 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val=1 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val=Yes 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val= 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:03.964 06:15:33 -- accel/accel.sh@21 -- # val= 00:06:03.964 06:15:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # IFS=: 00:06:03.964 06:15:33 -- accel/accel.sh@20 -- # read -r var val 00:06:04.904 06:15:34 -- accel/accel.sh@21 -- # val= 00:06:04.904 06:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:04.904 06:15:34 -- accel/accel.sh@20 -- # IFS=: 00:06:04.904 06:15:34 -- accel/accel.sh@20 -- # read -r var val 00:06:04.904 06:15:34 -- accel/accel.sh@21 -- # val= 00:06:04.904 06:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:04.904 06:15:34 -- accel/accel.sh@20 -- # IFS=: 00:06:04.904 06:15:34 -- accel/accel.sh@20 -- # read -r var val 00:06:04.904 06:15:34 -- accel/accel.sh@21 -- # val= 00:06:04.904 06:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:04.904 06:15:34 -- accel/accel.sh@20 -- # IFS=: 00:06:04.904 06:15:34 -- accel/accel.sh@20 -- # read -r var val 00:06:04.904 06:15:34 -- accel/accel.sh@21 -- # val= 00:06:04.904 06:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:04.904 06:15:34 -- accel/accel.sh@20 -- # IFS=: 00:06:04.904 06:15:34 -- accel/accel.sh@20 -- # read -r var val 00:06:04.904 06:15:34 -- accel/accel.sh@21 -- # val= 00:06:04.904 06:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:04.904 06:15:34 -- accel/accel.sh@20 -- # IFS=: 00:06:04.904 06:15:34 -- accel/accel.sh@20 -- # read -r var val 00:06:04.904 06:15:34 -- accel/accel.sh@21 -- # val= 00:06:04.904 06:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:04.904 06:15:34 -- accel/accel.sh@20 -- # IFS=: 00:06:04.904 06:15:34 -- accel/accel.sh@20 -- # read -r var val 00:06:04.904 06:15:34 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:04.904 06:15:34 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:04.904 06:15:34 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:04.904 00:06:04.904 real 0m2.653s 00:06:04.904 user 0m2.401s 00:06:04.904 sys 0m0.249s 00:06:04.904 06:15:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:04.904 06:15:34 -- common/autotest_common.sh@10 -- # set +x 00:06:04.904 ************************************ 00:06:04.904 END TEST accel_copy 00:06:04.904 ************************************ 00:06:04.904 06:15:34 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:04.904 06:15:34 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:05.163 06:15:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:05.163 06:15:34 -- common/autotest_common.sh@10 -- # set +x 00:06:05.164 ************************************ 00:06:05.164 START TEST accel_fill 00:06:05.164 ************************************ 00:06:05.164 06:15:34 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:05.164 06:15:34 -- accel/accel.sh@16 -- # local accel_opc 
00:06:05.164 06:15:34 -- accel/accel.sh@17 -- # local accel_module 00:06:05.164 06:15:34 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:05.164 06:15:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:05.164 06:15:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:05.164 06:15:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:05.164 06:15:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:05.164 06:15:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:05.164 06:15:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:05.164 06:15:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:05.164 06:15:34 -- accel/accel.sh@41 -- # local IFS=, 00:06:05.164 06:15:34 -- accel/accel.sh@42 -- # jq -r . 00:06:05.164 [2024-11-27 06:15:34.465000] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:05.164 [2024-11-27 06:15:34.465076] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid19218 ] 00:06:05.164 EAL: No free 2048 kB hugepages reported on node 1 00:06:05.164 [2024-11-27 06:15:34.532812] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.164 [2024-11-27 06:15:34.600449] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.544 06:15:35 -- accel/accel.sh@18 -- # out=' 00:06:06.544 SPDK Configuration: 00:06:06.544 Core mask: 0x1 00:06:06.544 00:06:06.544 Accel Perf Configuration: 00:06:06.544 Workload Type: fill 00:06:06.544 Fill pattern: 0x80 00:06:06.544 Transfer size: 4096 bytes 00:06:06.544 Vector count 1 00:06:06.544 Module: software 00:06:06.544 Queue depth: 64 00:06:06.544 Allocate depth: 64 00:06:06.544 # threads/core: 1 00:06:06.544 Run time: 1 seconds 00:06:06.544 Verify: Yes 00:06:06.544 00:06:06.544 Running for 1 seconds... 00:06:06.544 00:06:06.544 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:06.544 ------------------------------------------------------------------------------------ 00:06:06.544 0,0 979456/s 3826 MiB/s 0 0 00:06:06.544 ==================================================================================== 00:06:06.545 Total 979456/s 3826 MiB/s 0 0' 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:06.545 06:15:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:06.545 06:15:35 -- accel/accel.sh@12 -- # build_accel_config 00:06:06.545 06:15:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:06.545 06:15:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:06.545 06:15:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:06.545 06:15:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:06.545 06:15:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:06.545 06:15:35 -- accel/accel.sh@41 -- # local IFS=, 00:06:06.545 06:15:35 -- accel/accel.sh@42 -- # jq -r . 00:06:06.545 [2024-11-27 06:15:35.788155] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:06.545 [2024-11-27 06:15:35.788251] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid19486 ] 00:06:06.545 EAL: No free 2048 kB hugepages reported on node 1 00:06:06.545 [2024-11-27 06:15:35.856550] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:06.545 [2024-11-27 06:15:35.922545] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val= 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val= 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val=0x1 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val= 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val= 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val=fill 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val=0x80 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val= 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val=software 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@23 -- # accel_module=software 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val=64 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val=64 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- 
accel/accel.sh@21 -- # val=1 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val=Yes 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val= 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.545 06:15:35 -- accel/accel.sh@21 -- # val= 00:06:06.545 06:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.545 06:15:35 -- accel/accel.sh@20 -- # read -r var val 00:06:07.925 06:15:37 -- accel/accel.sh@21 -- # val= 00:06:07.925 06:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.925 06:15:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.925 06:15:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.925 06:15:37 -- accel/accel.sh@21 -- # val= 00:06:07.925 06:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.925 06:15:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.925 06:15:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.925 06:15:37 -- accel/accel.sh@21 -- # val= 00:06:07.925 06:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.925 06:15:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.925 06:15:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.925 06:15:37 -- accel/accel.sh@21 -- # val= 00:06:07.925 06:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.925 06:15:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.925 06:15:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.925 06:15:37 -- accel/accel.sh@21 -- # val= 00:06:07.925 06:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.925 06:15:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.925 06:15:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.925 06:15:37 -- accel/accel.sh@21 -- # val= 00:06:07.925 06:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.925 06:15:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.925 06:15:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.925 06:15:37 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:07.925 06:15:37 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:07.925 06:15:37 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:07.925 00:06:07.925 real 0m2.649s 00:06:07.925 user 0m2.408s 00:06:07.925 sys 0m0.239s 00:06:07.925 06:15:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:07.925 06:15:37 -- common/autotest_common.sh@10 -- # set +x 00:06:07.925 ************************************ 00:06:07.925 END TEST accel_fill 00:06:07.925 ************************************ 00:06:07.925 06:15:37 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:07.925 06:15:37 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:07.925 06:15:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:07.925 06:15:37 -- common/autotest_common.sh@10 -- # set +x 00:06:07.925 ************************************ 00:06:07.925 START TEST 
accel_copy_crc32c 00:06:07.925 ************************************ 00:06:07.925 06:15:37 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y 00:06:07.925 06:15:37 -- accel/accel.sh@16 -- # local accel_opc 00:06:07.926 06:15:37 -- accel/accel.sh@17 -- # local accel_module 00:06:07.926 06:15:37 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:07.926 06:15:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:07.926 06:15:37 -- accel/accel.sh@12 -- # build_accel_config 00:06:07.926 06:15:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:07.926 06:15:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:07.926 06:15:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:07.926 06:15:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:07.926 06:15:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:07.926 06:15:37 -- accel/accel.sh@41 -- # local IFS=, 00:06:07.926 06:15:37 -- accel/accel.sh@42 -- # jq -r . 00:06:07.926 [2024-11-27 06:15:37.157178] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:07.926 [2024-11-27 06:15:37.157263] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid19772 ] 00:06:07.926 EAL: No free 2048 kB hugepages reported on node 1 00:06:07.926 [2024-11-27 06:15:37.226990] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.926 [2024-11-27 06:15:37.294336] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.306 06:15:38 -- accel/accel.sh@18 -- # out=' 00:06:09.306 SPDK Configuration: 00:06:09.306 Core mask: 0x1 00:06:09.306 00:06:09.306 Accel Perf Configuration: 00:06:09.306 Workload Type: copy_crc32c 00:06:09.306 CRC-32C seed: 0 00:06:09.306 Vector size: 4096 bytes 00:06:09.306 Transfer size: 4096 bytes 00:06:09.306 Vector count 1 00:06:09.306 Module: software 00:06:09.306 Queue depth: 32 00:06:09.306 Allocate depth: 32 00:06:09.306 # threads/core: 1 00:06:09.306 Run time: 1 seconds 00:06:09.306 Verify: Yes 00:06:09.306 00:06:09.306 Running for 1 seconds... 00:06:09.306 00:06:09.306 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:09.306 ------------------------------------------------------------------------------------ 00:06:09.306 0,0 430080/s 1680 MiB/s 0 0 00:06:09.306 ==================================================================================== 00:06:09.306 Total 430080/s 1680 MiB/s 0 0' 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:09.306 06:15:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:09.306 06:15:38 -- accel/accel.sh@12 -- # build_accel_config 00:06:09.306 06:15:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:09.306 06:15:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:09.306 06:15:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:09.306 06:15:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:09.306 06:15:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:09.306 06:15:38 -- accel/accel.sh@41 -- # local IFS=, 00:06:09.306 06:15:38 -- accel/accel.sh@42 -- # jq -r . 
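A consistency check on these result tables: Bandwidth is simply Transfers × transfer size. For the copy_crc32c run above, 430080 transfers/s × 4096 bytes = 1,761,607,680 B/s, which is exactly 1680 MiB/s after dividing by 2^20. The same arithmetic accounts for the odd-looking crc32c -C 2 table earlier: the per-core row appears to count both 4096-byte vectors (614848/s × 8192 B ≈ 4803 MiB/s) while the Total row counts only one (614848/s × 4096 B ≈ 2401 MiB/s), hence the mismatched figures for a single-core run.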
00:06:09.306 [2024-11-27 06:15:38.484188] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:09.306 [2024-11-27 06:15:38.484280] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid20040 ] 00:06:09.306 EAL: No free 2048 kB hugepages reported on node 1 00:06:09.306 [2024-11-27 06:15:38.552571] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.306 [2024-11-27 06:15:38.618387] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val= 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val= 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val=0x1 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val= 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val= 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.306 06:15:38 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val=0 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val= 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val=software 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.306 06:15:38 -- accel/accel.sh@23 -- # accel_module=software 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val=32 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 
00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val=32 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.306 06:15:38 -- accel/accel.sh@21 -- # val=1 00:06:09.306 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.306 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.307 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.307 06:15:38 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:09.307 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.307 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.307 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.307 06:15:38 -- accel/accel.sh@21 -- # val=Yes 00:06:09.307 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.307 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.307 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.307 06:15:38 -- accel/accel.sh@21 -- # val= 00:06:09.307 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.307 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.307 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:09.307 06:15:38 -- accel/accel.sh@21 -- # val= 00:06:09.307 06:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.307 06:15:38 -- accel/accel.sh@20 -- # IFS=: 00:06:09.307 06:15:38 -- accel/accel.sh@20 -- # read -r var val 00:06:10.688 06:15:39 -- accel/accel.sh@21 -- # val= 00:06:10.688 06:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.688 06:15:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.688 06:15:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.688 06:15:39 -- accel/accel.sh@21 -- # val= 00:06:10.688 06:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.688 06:15:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.688 06:15:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.688 06:15:39 -- accel/accel.sh@21 -- # val= 00:06:10.688 06:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.688 06:15:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.688 06:15:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.688 06:15:39 -- accel/accel.sh@21 -- # val= 00:06:10.688 06:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.688 06:15:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.688 06:15:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.688 06:15:39 -- accel/accel.sh@21 -- # val= 00:06:10.688 06:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.688 06:15:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.688 06:15:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.688 06:15:39 -- accel/accel.sh@21 -- # val= 00:06:10.688 06:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.688 06:15:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.688 06:15:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.688 06:15:39 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:10.688 06:15:39 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:10.688 06:15:39 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:10.688 00:06:10.688 real 0m2.653s 00:06:10.688 user 0m2.393s 00:06:10.688 sys 0m0.259s 00:06:10.688 06:15:39 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:10.688 06:15:39 -- common/autotest_common.sh@10 -- # set +x 00:06:10.688 ************************************ 00:06:10.688 END TEST accel_copy_crc32c 00:06:10.688 ************************************ 00:06:10.688 
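
(Note on the workload just completed: a copy_crc32c operation copies a source buffer into a destination while accumulating a CRC-32C, the Castagnoli CRC, over the data. The stand-alone C sketch below is illustrative only; the names are hypothetical, and SPDK's software module uses optimized CRC routines rather than this bitwise loop.)

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Copy src into dst while folding each byte into a CRC-32C
     * (reflected Castagnoli polynomial 0x82F63B78). */
    uint32_t copy_crc32c(uint8_t *dst, const uint8_t *src, size_t len, uint32_t seed)
    {
        uint32_t crc = ~seed;                        /* CRC-32C works on the inverted seed */
        for (size_t i = 0; i < len; i++) {
            dst[i] = src[i];                         /* the "copy" half of the operation */
            crc ^= src[i];                           /* the "crc32c" half */
            for (int b = 0; b < 8; b++)
                crc = (crc >> 1) ^ (0x82F63B78u & (0u - (crc & 1u)));
        }
        return ~crc;
    }

    int main(void)
    {
        static uint8_t src[4096], dst[4096];         /* matches "Transfer size: 4096 bytes" */
        memset(src, 0xA5, sizeof(src));
        uint32_t crc = copy_crc32c(dst, src, sizeof(src), 0 /* "CRC-32C seed: 0" */);
        printf("crc=0x%08x copy_ok=%d\n", crc, memcmp(src, dst, sizeof(dst)) == 0);
        return 0;
    }

(The bandwidth column follows directly from the transfer rate: 430080 transfers/s at 4096 bytes each is about 1680 MiB/s. The accel_copy_crc32c_C2 test that starts next drives the same operation with two chained source vectors per submission.)
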
06:15:39 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:10.688 06:15:39 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:10.688 06:15:39 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:10.688 06:15:39 -- common/autotest_common.sh@10 -- # set +x 00:06:10.688 ************************************ 00:06:10.688 START TEST accel_copy_crc32c_C2 00:06:10.688 ************************************ 00:06:10.688 06:15:39 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:10.688 06:15:39 -- accel/accel.sh@16 -- # local accel_opc 00:06:10.688 06:15:39 -- accel/accel.sh@17 -- # local accel_module 00:06:10.688 06:15:39 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:10.688 06:15:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:10.688 06:15:39 -- accel/accel.sh@12 -- # build_accel_config 00:06:10.688 06:15:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:10.688 06:15:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:10.688 06:15:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:10.688 06:15:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:10.688 06:15:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:10.688 06:15:39 -- accel/accel.sh@41 -- # local IFS=, 00:06:10.688 06:15:39 -- accel/accel.sh@42 -- # jq -r . 00:06:10.688 [2024-11-27 06:15:39.854112] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:10.688 [2024-11-27 06:15:39.854205] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid20266 ] 00:06:10.688 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.688 [2024-11-27 06:15:39.923113] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.688 [2024-11-27 06:15:39.991298] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.627 06:15:41 -- accel/accel.sh@18 -- # out=' 00:06:11.627 SPDK Configuration: 00:06:11.627 Core mask: 0x1 00:06:11.627 00:06:11.627 Accel Perf Configuration: 00:06:11.627 Workload Type: copy_crc32c 00:06:11.627 CRC-32C seed: 0 00:06:11.627 Vector size: 4096 bytes 00:06:11.627 Transfer size: 8192 bytes 00:06:11.627 Vector count 2 00:06:11.627 Module: software 00:06:11.627 Queue depth: 32 00:06:11.627 Allocate depth: 32 00:06:11.627 # threads/core: 1 00:06:11.627 Run time: 1 seconds 00:06:11.627 Verify: Yes 00:06:11.627 00:06:11.627 Running for 1 seconds... 
00:06:11.627 00:06:11.627 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:11.627 ------------------------------------------------------------------------------------ 00:06:11.627 0,0 295264/s 2306 MiB/s 0 0 00:06:11.627 ==================================================================================== 00:06:11.627 Total 295264/s 2306 MiB/s 0 0' 00:06:11.887 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.887 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.887 06:15:41 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:11.887 06:15:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:11.887 06:15:41 -- accel/accel.sh@12 -- # build_accel_config 00:06:11.887 06:15:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:11.887 06:15:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:11.887 06:15:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:11.887 06:15:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:11.887 06:15:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:11.887 06:15:41 -- accel/accel.sh@41 -- # local IFS=, 00:06:11.887 06:15:41 -- accel/accel.sh@42 -- # jq -r . 00:06:11.887 [2024-11-27 06:15:41.182960] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... [2024-11-27 06:15:41.183045] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid20442 ] 00:06:11.887 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.887 [2024-11-27 06:15:41.251902] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.887 [2024-11-27 06:15:41.321007] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.887 06:15:41 -- accel/accel.sh@21 -- # val= 00:06:11.887 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.887 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.887 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.887 06:15:41 -- accel/accel.sh@21 -- # val= 00:06:11.887 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.887 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val=0x1 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val= 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val= 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val=0 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=:
00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val= 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val=software 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@23 -- # accel_module=software 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val=32 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val=32 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val=1 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val=Yes 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val= 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:11.888 06:15:41 -- accel/accel.sh@21 -- # val= 00:06:11.888 06:15:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # IFS=: 00:06:11.888 06:15:41 -- accel/accel.sh@20 -- # read -r var val 00:06:13.271 06:15:42 -- accel/accel.sh@21 -- # val= 00:06:13.271 06:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.271 06:15:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.271 06:15:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.271 06:15:42 -- accel/accel.sh@21 -- # val= 00:06:13.271 06:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.271 06:15:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.271 06:15:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.271 06:15:42 -- accel/accel.sh@21 -- # val= 00:06:13.271 06:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.271 06:15:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.271 06:15:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.271 06:15:42 -- accel/accel.sh@21 -- # val= 00:06:13.271 06:15:42 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:13.271 06:15:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.271 06:15:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.271 06:15:42 -- accel/accel.sh@21 -- # val= 00:06:13.271 06:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.271 06:15:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.271 06:15:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.271 06:15:42 -- accel/accel.sh@21 -- # val= 00:06:13.271 06:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.271 06:15:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.271 06:15:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.271 06:15:42 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:13.271 06:15:42 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:13.271 06:15:42 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:13.271 00:06:13.271 real 0m2.657s 00:06:13.271 user 0m2.401s 00:06:13.271 sys 0m0.254s 00:06:13.271 06:15:42 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:13.271 06:15:42 -- common/autotest_common.sh@10 -- # set +x 00:06:13.271 ************************************ 00:06:13.271 END TEST accel_copy_crc32c_C2 00:06:13.271 ************************************ 00:06:13.271 06:15:42 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:13.271 06:15:42 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:13.271 06:15:42 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:13.271 06:15:42 -- common/autotest_common.sh@10 -- # set +x 00:06:13.271 ************************************ 00:06:13.271 START TEST accel_dualcast 00:06:13.271 ************************************ 00:06:13.271 06:15:42 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dualcast -y 00:06:13.271 06:15:42 -- accel/accel.sh@16 -- # local accel_opc 00:06:13.271 06:15:42 -- accel/accel.sh@17 -- # local accel_module 00:06:13.271 06:15:42 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:13.271 06:15:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:13.271 06:15:42 -- accel/accel.sh@12 -- # build_accel_config 00:06:13.271 06:15:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:13.271 06:15:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:13.271 06:15:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:13.271 06:15:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:13.271 06:15:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:13.271 06:15:42 -- accel/accel.sh@41 -- # local IFS=, 00:06:13.271 06:15:42 -- accel/accel.sh@42 -- # jq -r . 00:06:13.271 [2024-11-27 06:15:42.560758] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
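
(The accel_copy_crc32c_C2 test that just ended passes -C 2: each operation carries two 4096-byte source vectors and chains the CRC across them, so one completed operation moves 8192 bytes, and 295264 ops/s works out to about 2306 MiB/s as the table shows. A hedged sketch of the chaining, reusing copy_crc32c() from the earlier note; the struct and names are made up for illustration.)

    #include <stddef.h>
    #include <stdint.h>

    struct src_vec { const uint8_t *base; uint8_t *dst; size_t len; };

    /* From the earlier sketch: copy one buffer while updating a CRC-32C. */
    uint32_t copy_crc32c(uint8_t *dst, const uint8_t *src, size_t len, uint32_t seed);

    /* Chained variant: the CRC returned for vector i seeds vector i+1, so the
     * final value equals the CRC-32C of the concatenated vectors. */
    uint32_t copy_crc32c_chained(const struct src_vec *v, int cnt, uint32_t seed)
    {
        uint32_t crc = seed;
        for (int i = 0; i < cnt; i++)
            crc = copy_crc32c(v[i].dst, v[i].base, v[i].len, crc);
        return crc;
    }
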
00:06:13.271 [2024-11-27 06:15:42.560848] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid20654 ] 00:06:13.271 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.271 [2024-11-27 06:15:42.629253] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.271 [2024-11-27 06:15:42.701880] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.652 06:15:43 -- accel/accel.sh@18 -- # out=' 00:06:14.652 SPDK Configuration: 00:06:14.652 Core mask: 0x1 00:06:14.652 00:06:14.652 Accel Perf Configuration: 00:06:14.652 Workload Type: dualcast 00:06:14.652 Transfer size: 4096 bytes 00:06:14.652 Vector count 1 00:06:14.652 Module: software 00:06:14.652 Queue depth: 32 00:06:14.652 Allocate depth: 32 00:06:14.652 # threads/core: 1 00:06:14.652 Run time: 1 seconds 00:06:14.652 Verify: Yes 00:06:14.652 00:06:14.652 Running for 1 seconds... 00:06:14.652 00:06:14.652 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:14.652 ------------------------------------------------------------------------------------ 00:06:14.652 0,0 628896/s 2456 MiB/s 0 0 00:06:14.652 ==================================================================================== 00:06:14.652 Total 628896/s 2456 MiB/s 0 0' 00:06:14.652 06:15:43 -- accel/accel.sh@20 -- # IFS=: 00:06:14.652 06:15:43 -- accel/accel.sh@20 -- # read -r var val 00:06:14.652 06:15:43 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:14.652 06:15:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:14.652 06:15:43 -- accel/accel.sh@12 -- # build_accel_config 00:06:14.652 06:15:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:14.652 06:15:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:14.652 06:15:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:14.652 06:15:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:14.652 06:15:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:14.652 06:15:43 -- accel/accel.sh@41 -- # local IFS=, 00:06:14.652 06:15:43 -- accel/accel.sh@42 -- # jq -r . 00:06:14.652 [2024-11-27 06:15:43.891111] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
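
(The dualcast results above come from an operation that writes one 4096-byte source to two destination buffers; offload engines can typically emit both writes from a single descriptor, while the software module measured here simply performs two copies. Illustrative sketch with hypothetical names.)

    #include <string.h>
    #include <stdint.h>

    /* One source, two destinations; the software path is just two memcpys. */
    void dualcast(uint8_t *dst1, uint8_t *dst2, const uint8_t *src, size_t len)
    {
        memcpy(dst1, src, len);
        memcpy(dst2, src, len);
    }

(Note that the bandwidth column appears to count the 4096-byte source once per operation, 628896/s x 4096 B being about 2456 MiB/s, even though twice that many bytes are written.)
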
00:06:14.652 [2024-11-27 06:15:43.891200] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid20906 ] 00:06:14.652 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.653 [2024-11-27 06:15:43.959490] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.653 [2024-11-27 06:15:44.031821] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val= 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val= 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val=0x1 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val= 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val= 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val=dualcast 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val= 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val=software 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@23 -- # accel_module=software 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val=32 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val=32 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val=1 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val=Yes 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val= 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:14.653 06:15:44 -- accel/accel.sh@21 -- # val= 00:06:14.653 06:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # IFS=: 00:06:14.653 06:15:44 -- accel/accel.sh@20 -- # read -r var val 00:06:16.033 06:15:45 -- accel/accel.sh@21 -- # val= 00:06:16.033 06:15:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.033 06:15:45 -- accel/accel.sh@20 -- # IFS=: 00:06:16.033 06:15:45 -- accel/accel.sh@20 -- # read -r var val 00:06:16.033 06:15:45 -- accel/accel.sh@21 -- # val= 00:06:16.033 06:15:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.033 06:15:45 -- accel/accel.sh@20 -- # IFS=: 00:06:16.033 06:15:45 -- accel/accel.sh@20 -- # read -r var val 00:06:16.033 06:15:45 -- accel/accel.sh@21 -- # val= 00:06:16.033 06:15:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.033 06:15:45 -- accel/accel.sh@20 -- # IFS=: 00:06:16.033 06:15:45 -- accel/accel.sh@20 -- # read -r var val 00:06:16.033 06:15:45 -- accel/accel.sh@21 -- # val= 00:06:16.033 06:15:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.033 06:15:45 -- accel/accel.sh@20 -- # IFS=: 00:06:16.033 06:15:45 -- accel/accel.sh@20 -- # read -r var val 00:06:16.033 06:15:45 -- accel/accel.sh@21 -- # val= 00:06:16.033 06:15:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.033 06:15:45 -- accel/accel.sh@20 -- # IFS=: 00:06:16.033 06:15:45 -- accel/accel.sh@20 -- # read -r var val 00:06:16.033 06:15:45 -- accel/accel.sh@21 -- # val= 00:06:16.033 06:15:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.033 06:15:45 -- accel/accel.sh@20 -- # IFS=: 00:06:16.033 06:15:45 -- accel/accel.sh@20 -- # read -r var val 00:06:16.033 06:15:45 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:16.033 06:15:45 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:16.033 06:15:45 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:16.033 00:06:16.033 real 0m2.664s 00:06:16.033 user 0m2.404s 00:06:16.033 sys 0m0.258s 00:06:16.033 06:15:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:16.033 06:15:45 -- common/autotest_common.sh@10 -- # set +x 00:06:16.033 ************************************ 00:06:16.033 END TEST accel_dualcast 00:06:16.033 ************************************ 00:06:16.033 06:15:45 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:16.033 06:15:45 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:16.033 06:15:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:16.033 06:15:45 -- common/autotest_common.sh@10 -- # set +x 00:06:16.033 ************************************ 00:06:16.033 START TEST accel_compare 00:06:16.033 ************************************ 00:06:16.033 06:15:45 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compare -y 00:06:16.033 06:15:45 -- accel/accel.sh@16 -- # local accel_opc 00:06:16.033 06:15:45 -- 
accel/accel.sh@17 -- # local accel_module 00:06:16.033 06:15:45 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:16.033 06:15:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:16.033 06:15:45 -- accel/accel.sh@12 -- # build_accel_config 00:06:16.033 06:15:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:16.033 06:15:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:16.033 06:15:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:16.033 06:15:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:16.033 06:15:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:16.033 06:15:45 -- accel/accel.sh@41 -- # local IFS=, 00:06:16.033 06:15:45 -- accel/accel.sh@42 -- # jq -r . 00:06:16.033 [2024-11-27 06:15:45.267069] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:16.033 [2024-11-27 06:15:45.267159] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid21194 ] 00:06:16.033 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.033 [2024-11-27 06:15:45.335473] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.033 [2024-11-27 06:15:45.402543] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.413 06:15:46 -- accel/accel.sh@18 -- # out=' 00:06:17.413 SPDK Configuration: 00:06:17.413 Core mask: 0x1 00:06:17.413 00:06:17.413 Accel Perf Configuration: 00:06:17.413 Workload Type: compare 00:06:17.413 Transfer size: 4096 bytes 00:06:17.413 Vector count 1 00:06:17.413 Module: software 00:06:17.413 Queue depth: 32 00:06:17.413 Allocate depth: 32 00:06:17.413 # threads/core: 1 00:06:17.413 Run time: 1 seconds 00:06:17.413 Verify: Yes 00:06:17.413 00:06:17.413 Running for 1 seconds... 00:06:17.413 00:06:17.413 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:17.413 ------------------------------------------------------------------------------------ 00:06:17.413 0,0 780000/s 3046 MiB/s 0 0 00:06:17.413 ==================================================================================== 00:06:17.413 Total 780000/s 3046 MiB/s 0 0' 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:17.413 06:15:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:17.413 06:15:46 -- accel/accel.sh@12 -- # build_accel_config 00:06:17.413 06:15:46 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:17.413 06:15:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:17.413 06:15:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:17.413 06:15:46 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:17.413 06:15:46 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:17.413 06:15:46 -- accel/accel.sh@41 -- # local IFS=, 00:06:17.413 06:15:46 -- accel/accel.sh@42 -- # jq -r . 00:06:17.413 [2024-11-27 06:15:46.589609] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
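
(compare, measured above, is essentially an offloadable memcmp: each operation checks two 4096-byte buffers, and any mismatch would be counted in the Miscompares column of the table. Minimal illustrative sketch; names are hypothetical.)

    #include <string.h>
    #include <stdint.h>

    /* Returns 0 when the buffers match and bumps the miscompare counter otherwise. */
    int compare_op(const uint8_t *a, const uint8_t *b, size_t len, uint64_t *miscompares)
    {
        if (memcmp(a, b, len) != 0) {
            (*miscompares)++;
            return -1;
        }
        return 0;
    }
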
00:06:17.413 [2024-11-27 06:15:46.589700] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid21462 ] 00:06:17.413 EAL: No free 2048 kB hugepages reported on node 1 00:06:17.413 [2024-11-27 06:15:46.658329] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:17.413 [2024-11-27 06:15:46.724127] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val= 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val= 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val=0x1 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val= 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val= 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val=compare 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val= 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val=software 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@23 -- # accel_module=software 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val=32 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val=32 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val=1 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val=Yes 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.413 06:15:46 -- accel/accel.sh@21 -- # val= 00:06:17.413 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.413 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:17.414 06:15:46 -- accel/accel.sh@21 -- # val= 00:06:17.414 06:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.414 06:15:46 -- accel/accel.sh@20 -- # IFS=: 00:06:17.414 06:15:46 -- accel/accel.sh@20 -- # read -r var val 00:06:18.794 06:15:47 -- accel/accel.sh@21 -- # val= 00:06:18.794 06:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.794 06:15:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.794 06:15:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.794 06:15:47 -- accel/accel.sh@21 -- # val= 00:06:18.794 06:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.794 06:15:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.794 06:15:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.794 06:15:47 -- accel/accel.sh@21 -- # val= 00:06:18.794 06:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.794 06:15:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.794 06:15:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.794 06:15:47 -- accel/accel.sh@21 -- # val= 00:06:18.794 06:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.794 06:15:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.794 06:15:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.794 06:15:47 -- accel/accel.sh@21 -- # val= 00:06:18.794 06:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.794 06:15:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.794 06:15:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.794 06:15:47 -- accel/accel.sh@21 -- # val= 00:06:18.794 06:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.794 06:15:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.794 06:15:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.794 06:15:47 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:18.794 06:15:47 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:06:18.794 06:15:47 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:18.794 00:06:18.794 real 0m2.645s 00:06:18.794 user 0m2.394s 00:06:18.794 sys 0m0.248s 00:06:18.794 06:15:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:18.794 06:15:47 -- common/autotest_common.sh@10 -- # set +x 00:06:18.794 ************************************ 00:06:18.794 END TEST accel_compare 00:06:18.794 ************************************ 00:06:18.794 06:15:47 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:18.794 06:15:47 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:18.794 06:15:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:18.794 06:15:47 -- common/autotest_common.sh@10 -- # set +x 00:06:18.794 ************************************ 00:06:18.794 START TEST accel_xor 00:06:18.794 ************************************ 00:06:18.794 06:15:47 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y 00:06:18.794 06:15:47 -- accel/accel.sh@16 -- # local accel_opc 00:06:18.794 06:15:47 -- accel/accel.sh@17 
-- # local accel_module 00:06:18.794 06:15:47 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:18.794 06:15:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:18.794 06:15:47 -- accel/accel.sh@12 -- # build_accel_config 00:06:18.794 06:15:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:18.794 06:15:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:18.794 06:15:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:18.794 06:15:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:18.794 06:15:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:18.794 06:15:47 -- accel/accel.sh@41 -- # local IFS=, 00:06:18.794 06:15:47 -- accel/accel.sh@42 -- # jq -r . 00:06:18.794 [2024-11-27 06:15:47.956496] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:18.794 [2024-11-27 06:15:47.956585] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid21744 ] 00:06:18.794 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.795 [2024-11-27 06:15:48.024762] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.795 [2024-11-27 06:15:48.092243] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.734 06:15:49 -- accel/accel.sh@18 -- # out=' 00:06:19.734 SPDK Configuration: 00:06:19.734 Core mask: 0x1 00:06:19.734 00:06:19.734 Accel Perf Configuration: 00:06:19.734 Workload Type: xor 00:06:19.734 Source buffers: 2 00:06:19.734 Transfer size: 4096 bytes 00:06:19.734 Vector count 1 00:06:19.734 Module: software 00:06:19.734 Queue depth: 32 00:06:19.734 Allocate depth: 32 00:06:19.734 # threads/core: 1 00:06:19.734 Run time: 1 seconds 00:06:19.734 Verify: Yes 00:06:19.734 00:06:19.734 Running for 1 seconds... 00:06:19.734 00:06:19.734 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:19.734 ------------------------------------------------------------------------------------ 00:06:19.734 0,0 719712/s 2811 MiB/s 0 0 00:06:19.734 ==================================================================================== 00:06:19.734 Total 719712/s 2811 MiB/s 0 0' 00:06:19.734 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.734 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.734 06:15:49 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:19.734 06:15:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:19.734 06:15:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:19.734 06:15:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:19.734 06:15:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:19.734 06:15:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:19.734 06:15:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:19.734 06:15:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:19.734 06:15:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:19.734 06:15:49 -- accel/accel.sh@42 -- # jq -r . 00:06:19.993 [2024-11-27 06:15:49.281886] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
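
(The xor run above uses "Source buffers: 2": each operation writes the byte-wise XOR of two 4096-byte sources into a destination buffer, the parity primitive behind RAID-5-style protection. Illustrative sketch, hypothetical names.)

    #include <stddef.h>
    #include <stdint.h>

    void xor_2src(uint8_t *dst, const uint8_t *s0, const uint8_t *s1, size_t len)
    {
        for (size_t i = 0; i < len; i++)
            dst[i] = s0[i] ^ s1[i];    /* dst becomes the parity of s0 and s1 */
    }
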
00:06:19.993 [2024-11-27 06:15:49.281974] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid22020 ] 00:06:19.993 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.993 [2024-11-27 06:15:49.349953] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.993 [2024-11-27 06:15:49.415760] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.993 06:15:49 -- accel/accel.sh@21 -- # val= 00:06:19.993 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.993 06:15:49 -- accel/accel.sh@21 -- # val= 00:06:19.993 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.993 06:15:49 -- accel/accel.sh@21 -- # val=0x1 00:06:19.993 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.993 06:15:49 -- accel/accel.sh@21 -- # val= 00:06:19.993 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.993 06:15:49 -- accel/accel.sh@21 -- # val= 00:06:19.993 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.993 06:15:49 -- accel/accel.sh@21 -- # val=xor 00:06:19.993 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.993 06:15:49 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.993 06:15:49 -- accel/accel.sh@21 -- # val=2 00:06:19.993 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.993 06:15:49 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:19.993 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.993 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.993 06:15:49 -- accel/accel.sh@21 -- # val= 00:06:19.994 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.994 06:15:49 -- accel/accel.sh@21 -- # val=software 00:06:19.994 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.994 06:15:49 -- accel/accel.sh@23 -- # accel_module=software 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.994 06:15:49 -- accel/accel.sh@21 -- # val=32 00:06:19.994 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.994 06:15:49 -- accel/accel.sh@21 -- # val=32 00:06:19.994 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.994 06:15:49 -- 
accel/accel.sh@21 -- # val=1 00:06:19.994 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.994 06:15:49 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:19.994 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.994 06:15:49 -- accel/accel.sh@21 -- # val=Yes 00:06:19.994 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.994 06:15:49 -- accel/accel.sh@21 -- # val= 00:06:19.994 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:19.994 06:15:49 -- accel/accel.sh@21 -- # val= 00:06:19.994 06:15:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # IFS=: 00:06:19.994 06:15:49 -- accel/accel.sh@20 -- # read -r var val 00:06:21.374 06:15:50 -- accel/accel.sh@21 -- # val= 00:06:21.374 06:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.374 06:15:50 -- accel/accel.sh@20 -- # IFS=: 00:06:21.374 06:15:50 -- accel/accel.sh@20 -- # read -r var val 00:06:21.374 06:15:50 -- accel/accel.sh@21 -- # val= 00:06:21.374 06:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.374 06:15:50 -- accel/accel.sh@20 -- # IFS=: 00:06:21.374 06:15:50 -- accel/accel.sh@20 -- # read -r var val 00:06:21.374 06:15:50 -- accel/accel.sh@21 -- # val= 00:06:21.374 06:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.374 06:15:50 -- accel/accel.sh@20 -- # IFS=: 00:06:21.374 06:15:50 -- accel/accel.sh@20 -- # read -r var val 00:06:21.374 06:15:50 -- accel/accel.sh@21 -- # val= 00:06:21.374 06:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.374 06:15:50 -- accel/accel.sh@20 -- # IFS=: 00:06:21.374 06:15:50 -- accel/accel.sh@20 -- # read -r var val 00:06:21.374 06:15:50 -- accel/accel.sh@21 -- # val= 00:06:21.374 06:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.374 06:15:50 -- accel/accel.sh@20 -- # IFS=: 00:06:21.374 06:15:50 -- accel/accel.sh@20 -- # read -r var val 00:06:21.374 06:15:50 -- accel/accel.sh@21 -- # val= 00:06:21.374 06:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.374 06:15:50 -- accel/accel.sh@20 -- # IFS=: 00:06:21.374 06:15:50 -- accel/accel.sh@20 -- # read -r var val 00:06:21.374 06:15:50 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:21.374 06:15:50 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:21.374 06:15:50 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:21.374 00:06:21.374 real 0m2.651s 00:06:21.374 user 0m2.410s 00:06:21.374 sys 0m0.239s 00:06:21.374 06:15:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:21.374 06:15:50 -- common/autotest_common.sh@10 -- # set +x 00:06:21.374 ************************************ 00:06:21.374 END TEST accel_xor 00:06:21.374 ************************************ 00:06:21.374 06:15:50 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:21.374 06:15:50 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:21.374 06:15:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:21.374 06:15:50 -- common/autotest_common.sh@10 -- # set +x 00:06:21.374 ************************************ 00:06:21.374 START TEST accel_xor 
00:06:21.374 ************************************ 00:06:21.374 06:15:50 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y -x 3 00:06:21.374 06:15:50 -- accel/accel.sh@16 -- # local accel_opc 00:06:21.374 06:15:50 -- accel/accel.sh@17 -- # local accel_module 00:06:21.374 06:15:50 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:06:21.374 06:15:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:21.374 06:15:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:21.374 06:15:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:21.374 06:15:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:21.374 06:15:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:21.374 06:15:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:21.374 06:15:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:21.374 06:15:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:21.374 06:15:50 -- accel/accel.sh@42 -- # jq -r . 00:06:21.374 [2024-11-27 06:15:50.653083] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:21.374 [2024-11-27 06:15:50.653174] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid22247 ] 00:06:21.374 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.374 [2024-11-27 06:15:50.723871] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:21.374 [2024-11-27 06:15:50.792588] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.755 06:15:51 -- accel/accel.sh@18 -- # out=' 00:06:22.755 SPDK Configuration: 00:06:22.755 Core mask: 0x1 00:06:22.755 00:06:22.755 Accel Perf Configuration: 00:06:22.755 Workload Type: xor 00:06:22.755 Source buffers: 3 00:06:22.755 Transfer size: 4096 bytes 00:06:22.755 Vector count 1 00:06:22.755 Module: software 00:06:22.755 Queue depth: 32 00:06:22.755 Allocate depth: 32 00:06:22.755 # threads/core: 1 00:06:22.755 Run time: 1 seconds 00:06:22.755 Verify: Yes 00:06:22.755 00:06:22.756 Running for 1 seconds... 00:06:22.756 00:06:22.756 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:22.756 ------------------------------------------------------------------------------------ 00:06:22.756 0,0 664832/s 2597 MiB/s 0 0 00:06:22.756 ==================================================================================== 00:06:22.756 Total 664832/s 2597 MiB/s 0 0' 00:06:22.756 06:15:51 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:51 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:51 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:22.756 06:15:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:22.756 06:15:51 -- accel/accel.sh@12 -- # build_accel_config 00:06:22.756 06:15:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:22.756 06:15:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.756 06:15:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.756 06:15:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:22.756 06:15:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:22.756 06:15:51 -- accel/accel.sh@41 -- # local IFS=, 00:06:22.756 06:15:51 -- accel/accel.sh@42 -- # jq -r . 00:06:22.756 [2024-11-27 06:15:51.984068] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
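
(This second accel_xor test runs the same primitive with three source buffers, per "Source buffers: 3" above; generalized to N sources, each output byte folds in one byte from every source. Sketch with hypothetical names. The drop from 2811 MiB/s with two sources to 2597 MiB/s with three is consistent with the extra read stream per byte of output.)

    #include <stddef.h>
    #include <stdint.h>

    void xor_nsrc(uint8_t *dst, const uint8_t *const *srcs, int nsrc, size_t len)
    {
        for (size_t i = 0; i < len; i++) {
            uint8_t v = srcs[0][i];
            for (int s = 1; s < nsrc; s++)
                v ^= srcs[s][i];
            dst[i] = v;
        }
    }
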
00:06:22.756 [2024-11-27 06:15:51.984177] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid22407 ] 00:06:22.756 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.756 [2024-11-27 06:15:52.052392] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.756 [2024-11-27 06:15:52.120898] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val= 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val= 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val=0x1 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val= 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val= 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val=xor 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val=3 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val= 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val=software 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@23 -- # accel_module=software 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val=32 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val=32 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- 
accel/accel.sh@21 -- # val=1 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val=Yes 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val= 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:22.756 06:15:52 -- accel/accel.sh@21 -- # val= 00:06:22.756 06:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # IFS=: 00:06:22.756 06:15:52 -- accel/accel.sh@20 -- # read -r var val 00:06:24.136 06:15:53 -- accel/accel.sh@21 -- # val= 00:06:24.136 06:15:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.136 06:15:53 -- accel/accel.sh@20 -- # IFS=: 00:06:24.136 06:15:53 -- accel/accel.sh@20 -- # read -r var val 00:06:24.136 06:15:53 -- accel/accel.sh@21 -- # val= 00:06:24.136 06:15:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.136 06:15:53 -- accel/accel.sh@20 -- # IFS=: 00:06:24.136 06:15:53 -- accel/accel.sh@20 -- # read -r var val 00:06:24.136 06:15:53 -- accel/accel.sh@21 -- # val= 00:06:24.136 06:15:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.136 06:15:53 -- accel/accel.sh@20 -- # IFS=: 00:06:24.136 06:15:53 -- accel/accel.sh@20 -- # read -r var val 00:06:24.136 06:15:53 -- accel/accel.sh@21 -- # val= 00:06:24.136 06:15:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.136 06:15:53 -- accel/accel.sh@20 -- # IFS=: 00:06:24.136 06:15:53 -- accel/accel.sh@20 -- # read -r var val 00:06:24.136 06:15:53 -- accel/accel.sh@21 -- # val= 00:06:24.136 06:15:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.136 06:15:53 -- accel/accel.sh@20 -- # IFS=: 00:06:24.136 06:15:53 -- accel/accel.sh@20 -- # read -r var val 00:06:24.136 06:15:53 -- accel/accel.sh@21 -- # val= 00:06:24.136 06:15:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.136 06:15:53 -- accel/accel.sh@20 -- # IFS=: 00:06:24.136 06:15:53 -- accel/accel.sh@20 -- # read -r var val 00:06:24.136 06:15:53 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:24.136 06:15:53 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:24.136 06:15:53 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:24.136 00:06:24.136 real 0m2.660s 00:06:24.136 user 0m2.388s 00:06:24.136 sys 0m0.270s 00:06:24.136 06:15:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:24.136 06:15:53 -- common/autotest_common.sh@10 -- # set +x 00:06:24.136 ************************************ 00:06:24.136 END TEST accel_xor 00:06:24.136 ************************************ 00:06:24.136 06:15:53 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:24.136 06:15:53 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:24.136 06:15:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:24.136 06:15:53 -- common/autotest_common.sh@10 -- # set +x 00:06:24.136 ************************************ 00:06:24.136 START TEST 
accel_dif_verify 00:06:24.136 ************************************ 00:06:24.136 06:15:53 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_verify 00:06:24.136 06:15:53 -- accel/accel.sh@16 -- # local accel_opc 00:06:24.136 06:15:53 -- accel/accel.sh@17 -- # local accel_module 00:06:24.136 06:15:53 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:06:24.136 06:15:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:24.136 06:15:53 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.136 06:15:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.136 06:15:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.136 06:15:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.136 06:15:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.137 06:15:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.137 06:15:53 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.137 06:15:53 -- accel/accel.sh@42 -- # jq -r . 00:06:24.137 [2024-11-27 06:15:53.355641] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:24.137 [2024-11-27 06:15:53.355733] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid22618 ] 00:06:24.137 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.137 [2024-11-27 06:15:53.426105] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.137 [2024-11-27 06:15:53.493820] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.517 06:15:54 -- accel/accel.sh@18 -- # out=' 00:06:25.517 SPDK Configuration: 00:06:25.517 Core mask: 0x1 00:06:25.517 00:06:25.517 Accel Perf Configuration: 00:06:25.517 Workload Type: dif_verify 00:06:25.517 Vector size: 4096 bytes 00:06:25.517 Transfer size: 4096 bytes 00:06:25.517 Block size: 512 bytes 00:06:25.517 Metadata size: 8 bytes 00:06:25.517 Vector count 1 00:06:25.517 Module: software 00:06:25.517 Queue depth: 32 00:06:25.517 Allocate depth: 32 00:06:25.517 # threads/core: 1 00:06:25.517 Run time: 1 seconds 00:06:25.517 Verify: No 00:06:25.517 00:06:25.517 Running for 1 seconds... 00:06:25.517 00:06:25.517 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:25.517 ------------------------------------------------------------------------------------ 00:06:25.517 0,0 248672/s 986 MiB/s 0 0 00:06:25.517 ==================================================================================== 00:06:25.517 Total 248672/s 971 MiB/s 0 0' 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:25.517 06:15:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:25.517 06:15:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:25.517 06:15:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:25.517 06:15:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:25.517 06:15:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:25.517 06:15:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:25.517 06:15:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:25.517 06:15:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:25.517 06:15:54 -- accel/accel.sh@42 -- # jq -r . 
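A note on the shell idiom behind the repeated trace records above: accel.sh consumes its settings as colon-separated "var: val" pairs, scoping IFS to the read alone and dispatching on the key with a case statement, which is why the same `IFS=:` / `read -r var val` / `case "$var" in` records recur for every pair. A minimal, self-contained sketch of that pattern under assumed names (`parse_settings` and the sample input are illustrative, not the harness's own code):

    #!/usr/bin/env bash
    # Minimal sketch of the var/val dispatch idiom traced above.
    # The function name and the sample input are hypothetical,
    # not accel.sh's own code.
    parse_settings() {
        local var val opc module
        while IFS=: read -r var val; do
            val=${val# }                 # drop the space after the colon
            case "$var" in
                opc)    opc=$val ;;      # e.g. dif_verify, compress
                module) module=$val ;;   # e.g. software
                *)      ;;               # ignore keys we do not know
            esac
        done
        echo "workload=$opc module=$module"
    }

    printf 'opc: dif_verify\nmodule: software\n' | parse_settings

Scoping IFS to the read command alone keeps word-splitting untouched for the rest of the script, and bash's xtrace prints that temporary assignment as its own record on each iteration, matching the recurring lines in this log.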
00:06:25.517 [2024-11-27 06:15:54.676128] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:25.517 [2024-11-27 06:15:54.676197] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid22878 ] 00:06:25.517 EAL: No free 2048 kB hugepages reported on node 1 00:06:25.517 [2024-11-27 06:15:54.740848] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:25.517 [2024-11-27 06:15:54.806677] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val= 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val= 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val=0x1 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val= 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val= 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val=dif_verify 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val= 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val=software 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@23 -- # 
accel_module=software 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val=32 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.517 06:15:54 -- accel/accel.sh@21 -- # val=32 00:06:25.517 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.517 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.518 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.518 06:15:54 -- accel/accel.sh@21 -- # val=1 00:06:25.518 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.518 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.518 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.518 06:15:54 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:25.518 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.518 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.518 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.518 06:15:54 -- accel/accel.sh@21 -- # val=No 00:06:25.518 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.518 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.518 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.518 06:15:54 -- accel/accel.sh@21 -- # val= 00:06:25.518 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.518 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.518 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.518 06:15:54 -- accel/accel.sh@21 -- # val= 00:06:25.518 06:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.518 06:15:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.518 06:15:54 -- accel/accel.sh@20 -- # read -r var val 00:06:26.456 06:15:55 -- accel/accel.sh@21 -- # val= 00:06:26.456 06:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.456 06:15:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.456 06:15:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.456 06:15:55 -- accel/accel.sh@21 -- # val= 00:06:26.456 06:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.456 06:15:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.456 06:15:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.456 06:15:55 -- accel/accel.sh@21 -- # val= 00:06:26.456 06:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.456 06:15:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.456 06:15:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.456 06:15:55 -- accel/accel.sh@21 -- # val= 00:06:26.456 06:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.456 06:15:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.456 06:15:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.456 06:15:55 -- accel/accel.sh@21 -- # val= 00:06:26.456 06:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.456 06:15:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.456 06:15:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.456 06:15:55 -- accel/accel.sh@21 -- # val= 00:06:26.456 06:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.456 06:15:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.456 06:15:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.456 06:15:55 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:26.456 06:15:55 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:06:26.456 06:15:55 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:26.456 00:06:26.456 real 0m2.643s 00:06:26.456 user 0m2.391s 00:06:26.456 sys 0m0.250s 00:06:26.456 06:15:55 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:06:26.456 06:15:55 -- common/autotest_common.sh@10 -- # set +x 00:06:26.456 ************************************ 00:06:26.456 END TEST accel_dif_verify 00:06:26.456 ************************************ 00:06:26.716 06:15:56 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:26.716 06:15:56 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:26.716 06:15:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:26.716 06:15:56 -- common/autotest_common.sh@10 -- # set +x 00:06:26.716 ************************************ 00:06:26.716 START TEST accel_dif_generate 00:06:26.716 ************************************ 00:06:26.716 06:15:56 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate 00:06:26.716 06:15:56 -- accel/accel.sh@16 -- # local accel_opc 00:06:26.716 06:15:56 -- accel/accel.sh@17 -- # local accel_module 00:06:26.716 06:15:56 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:06:26.716 06:15:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:26.716 06:15:56 -- accel/accel.sh@12 -- # build_accel_config 00:06:26.716 06:15:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:26.716 06:15:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:26.716 06:15:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:26.716 06:15:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:26.716 06:15:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:26.716 06:15:56 -- accel/accel.sh@41 -- # local IFS=, 00:06:26.716 06:15:56 -- accel/accel.sh@42 -- # jq -r . 00:06:26.716 [2024-11-27 06:15:56.042927] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:26.716 [2024-11-27 06:15:56.043015] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid23163 ] 00:06:26.716 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.716 [2024-11-27 06:15:56.113066] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.716 [2024-11-27 06:15:56.180883] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.096 06:15:57 -- accel/accel.sh@18 -- # out=' 00:06:28.096 SPDK Configuration: 00:06:28.096 Core mask: 0x1 00:06:28.096 00:06:28.096 Accel Perf Configuration: 00:06:28.096 Workload Type: dif_generate 00:06:28.096 Vector size: 4096 bytes 00:06:28.096 Transfer size: 4096 bytes 00:06:28.096 Block size: 512 bytes 00:06:28.096 Metadata size: 8 bytes 00:06:28.096 Vector count 1 00:06:28.096 Module: software 00:06:28.096 Queue depth: 32 00:06:28.096 Allocate depth: 32 00:06:28.096 # threads/core: 1 00:06:28.096 Run time: 1 seconds 00:06:28.096 Verify: No 00:06:28.096 00:06:28.096 Running for 1 seconds... 
00:06:28.096 00:06:28.096 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:28.096 ------------------------------------------------------------------------------------ 00:06:28.096 0,0 287104/s 1139 MiB/s 0 0 00:06:28.096 ==================================================================================== 00:06:28.096 Total 287104/s 1121 MiB/s 0 0' 00:06:28.096 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.096 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.096 06:15:57 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:28.096 06:15:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:28.096 06:15:57 -- accel/accel.sh@12 -- # build_accel_config 00:06:28.097 06:15:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:28.097 06:15:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:28.097 06:15:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:28.097 06:15:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:28.097 06:15:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:28.097 06:15:57 -- accel/accel.sh@41 -- # local IFS=, 00:06:28.097 06:15:57 -- accel/accel.sh@42 -- # jq -r . 00:06:28.097 [2024-11-27 06:15:57.369321] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:28.097 [2024-11-27 06:15:57.369412] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid23432 ] 00:06:28.097 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.097 [2024-11-27 06:15:57.438264] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.097 [2024-11-27 06:15:57.504034] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val= 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val= 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val=0x1 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val= 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val= 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val=dif_generate 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 
00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val= 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val=software 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@23 -- # accel_module=software 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val=32 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val=32 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val=1 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val=No 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val= 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:28.097 06:15:57 -- accel/accel.sh@21 -- # val= 00:06:28.097 06:15:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # IFS=: 00:06:28.097 06:15:57 -- accel/accel.sh@20 -- # read -r var val 00:06:29.476 06:15:58 -- accel/accel.sh@21 -- # val= 00:06:29.476 06:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.476 06:15:58 -- accel/accel.sh@20 -- # IFS=: 00:06:29.476 06:15:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.476 06:15:58 -- accel/accel.sh@21 -- # val= 00:06:29.476 06:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.476 06:15:58 -- accel/accel.sh@20 -- # IFS=: 00:06:29.476 06:15:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.476 06:15:58 -- accel/accel.sh@21 -- # val= 00:06:29.476 06:15:58 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:29.476 06:15:58 -- accel/accel.sh@20 -- # IFS=: 00:06:29.476 06:15:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.476 06:15:58 -- accel/accel.sh@21 -- # val= 00:06:29.476 06:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.476 06:15:58 -- accel/accel.sh@20 -- # IFS=: 00:06:29.476 06:15:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.476 06:15:58 -- accel/accel.sh@21 -- # val= 00:06:29.476 06:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.476 06:15:58 -- accel/accel.sh@20 -- # IFS=: 00:06:29.476 06:15:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.476 06:15:58 -- accel/accel.sh@21 -- # val= 00:06:29.476 06:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.476 06:15:58 -- accel/accel.sh@20 -- # IFS=: 00:06:29.476 06:15:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.476 06:15:58 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:29.476 06:15:58 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:06:29.476 06:15:58 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:29.476 00:06:29.476 real 0m2.652s 00:06:29.476 user 0m2.393s 00:06:29.476 sys 0m0.259s 00:06:29.476 06:15:58 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:29.476 06:15:58 -- common/autotest_common.sh@10 -- # set +x 00:06:29.476 ************************************ 00:06:29.476 END TEST accel_dif_generate 00:06:29.476 ************************************ 00:06:29.476 06:15:58 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:06:29.476 06:15:58 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:29.476 06:15:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:29.476 06:15:58 -- common/autotest_common.sh@10 -- # set +x 00:06:29.476 ************************************ 00:06:29.476 START TEST accel_dif_generate_copy 00:06:29.476 ************************************ 00:06:29.476 06:15:58 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate_copy 00:06:29.476 06:15:58 -- accel/accel.sh@16 -- # local accel_opc 00:06:29.476 06:15:58 -- accel/accel.sh@17 -- # local accel_module 00:06:29.476 06:15:58 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:06:29.476 06:15:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:29.476 06:15:58 -- accel/accel.sh@12 -- # build_accel_config 00:06:29.476 06:15:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:29.476 06:15:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:29.476 06:15:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:29.476 06:15:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:29.476 06:15:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:29.476 06:15:58 -- accel/accel.sh@41 -- # local IFS=, 00:06:29.476 06:15:58 -- accel/accel.sh@42 -- # jq -r . 00:06:29.476 [2024-11-27 06:15:58.739708] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:29.476 [2024-11-27 06:15:58.739794] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid23721 ] 00:06:29.476 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.476 [2024-11-27 06:15:58.808741] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:29.476 [2024-11-27 06:15:58.876020] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.854 06:16:00 -- accel/accel.sh@18 -- # out=' 00:06:30.854 SPDK Configuration: 00:06:30.854 Core mask: 0x1 00:06:30.854 00:06:30.854 Accel Perf Configuration: 00:06:30.854 Workload Type: dif_generate_copy 00:06:30.854 Vector size: 4096 bytes 00:06:30.854 Transfer size: 4096 bytes 00:06:30.854 Vector count 1 00:06:30.854 Module: software 00:06:30.854 Queue depth: 32 00:06:30.854 Allocate depth: 32 00:06:30.854 # threads/core: 1 00:06:30.854 Run time: 1 seconds 00:06:30.854 Verify: No 00:06:30.854 00:06:30.854 Running for 1 seconds... 00:06:30.854 00:06:30.854 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:30.854 ------------------------------------------------------------------------------------ 00:06:30.854 0,0 225408/s 894 MiB/s 0 0 00:06:30.854 ==================================================================================== 00:06:30.854 Total 225408/s 880 MiB/s 0 0' 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:30.854 06:16:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:30.854 06:16:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:30.854 06:16:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:30.854 06:16:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:30.854 06:16:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:30.854 06:16:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:30.854 06:16:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:30.854 06:16:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:30.854 06:16:00 -- accel/accel.sh@42 -- # jq -r . 00:06:30.854 [2024-11-27 06:16:00.067549] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:30.854 [2024-11-27 06:16:00.067646] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid23987 ] 00:06:30.854 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.854 [2024-11-27 06:16:00.138030] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.854 [2024-11-27 06:16:00.207136] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val= 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val= 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val=0x1 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val= 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val= 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val= 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val=software 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@23 -- # accel_module=software 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val=32 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val=32 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var 
val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val=1 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val=No 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val= 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:30.854 06:16:00 -- accel/accel.sh@21 -- # val= 00:06:30.854 06:16:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # IFS=: 00:06:30.854 06:16:00 -- accel/accel.sh@20 -- # read -r var val 00:06:32.231 06:16:01 -- accel/accel.sh@21 -- # val= 00:06:32.231 06:16:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.231 06:16:01 -- accel/accel.sh@20 -- # IFS=: 00:06:32.231 06:16:01 -- accel/accel.sh@20 -- # read -r var val 00:06:32.231 06:16:01 -- accel/accel.sh@21 -- # val= 00:06:32.231 06:16:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.231 06:16:01 -- accel/accel.sh@20 -- # IFS=: 00:06:32.231 06:16:01 -- accel/accel.sh@20 -- # read -r var val 00:06:32.231 06:16:01 -- accel/accel.sh@21 -- # val= 00:06:32.231 06:16:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.231 06:16:01 -- accel/accel.sh@20 -- # IFS=: 00:06:32.231 06:16:01 -- accel/accel.sh@20 -- # read -r var val 00:06:32.231 06:16:01 -- accel/accel.sh@21 -- # val= 00:06:32.231 06:16:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.231 06:16:01 -- accel/accel.sh@20 -- # IFS=: 00:06:32.231 06:16:01 -- accel/accel.sh@20 -- # read -r var val 00:06:32.231 06:16:01 -- accel/accel.sh@21 -- # val= 00:06:32.231 06:16:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.231 06:16:01 -- accel/accel.sh@20 -- # IFS=: 00:06:32.231 06:16:01 -- accel/accel.sh@20 -- # read -r var val 00:06:32.231 06:16:01 -- accel/accel.sh@21 -- # val= 00:06:32.231 06:16:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.231 06:16:01 -- accel/accel.sh@20 -- # IFS=: 00:06:32.231 06:16:01 -- accel/accel.sh@20 -- # read -r var val 00:06:32.231 06:16:01 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:32.231 06:16:01 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:06:32.231 06:16:01 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:32.231 00:06:32.231 real 0m2.661s 00:06:32.231 user 0m2.394s 00:06:32.231 sys 0m0.265s 00:06:32.231 06:16:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:32.231 06:16:01 -- common/autotest_common.sh@10 -- # set +x 00:06:32.231 ************************************ 00:06:32.231 END TEST accel_dif_generate_copy 00:06:32.231 ************************************ 00:06:32.231 06:16:01 -- accel/accel.sh@107 -- # [[ y == y ]] 00:06:32.231 06:16:01 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:32.231 06:16:01 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:32.231 06:16:01 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:06:32.231 06:16:01 -- common/autotest_common.sh@10 -- # set +x 00:06:32.231 ************************************ 00:06:32.231 START TEST accel_comp 00:06:32.231 ************************************ 00:06:32.231 06:16:01 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:32.231 06:16:01 -- accel/accel.sh@16 -- # local accel_opc 00:06:32.231 06:16:01 -- accel/accel.sh@17 -- # local accel_module 00:06:32.231 06:16:01 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:32.231 06:16:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:32.231 06:16:01 -- accel/accel.sh@12 -- # build_accel_config 00:06:32.231 06:16:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:32.231 06:16:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:32.231 06:16:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:32.231 06:16:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:32.231 06:16:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:32.231 06:16:01 -- accel/accel.sh@41 -- # local IFS=, 00:06:32.231 06:16:01 -- accel/accel.sh@42 -- # jq -r . 00:06:32.231 [2024-11-27 06:16:01.445165] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:32.231 [2024-11-27 06:16:01.445260] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid24231 ] 00:06:32.231 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.231 [2024-11-27 06:16:01.513929] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.231 [2024-11-27 06:16:01.582147] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.610 06:16:02 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:33.610 00:06:33.610 SPDK Configuration: 00:06:33.610 Core mask: 0x1 00:06:33.610 00:06:33.610 Accel Perf Configuration: 00:06:33.610 Workload Type: compress 00:06:33.610 Transfer size: 4096 bytes 00:06:33.610 Vector count 1 00:06:33.610 Module: software 00:06:33.610 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:33.610 Queue depth: 32 00:06:33.610 Allocate depth: 32 00:06:33.610 # threads/core: 1 00:06:33.610 Run time: 1 seconds 00:06:33.610 Verify: No 00:06:33.610 00:06:33.610 Running for 1 seconds... 
00:06:33.610 00:06:33.610 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:33.610 ------------------------------------------------------------------------------------ 00:06:33.610 0,0 68384/s 285 MiB/s 0 0 00:06:33.610 ==================================================================================== 00:06:33.610 Total 68384/s 267 MiB/s 0 0' 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:33.610 06:16:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:33.610 06:16:02 -- accel/accel.sh@12 -- # build_accel_config 00:06:33.610 06:16:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:33.610 06:16:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.610 06:16:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.610 06:16:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:33.610 06:16:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:33.610 06:16:02 -- accel/accel.sh@41 -- # local IFS=, 00:06:33.610 06:16:02 -- accel/accel.sh@42 -- # jq -r . 00:06:33.610 [2024-11-27 06:16:02.775606] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:33.610 [2024-11-27 06:16:02.775697] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid24404 ] 00:06:33.610 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.610 [2024-11-27 06:16:02.846949] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.610 [2024-11-27 06:16:02.915996] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val= 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val= 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val= 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val=0x1 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val= 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val= 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val=compress 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 
06:16:02 -- accel/accel.sh@24 -- # accel_opc=compress 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val= 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val=software 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@23 -- # accel_module=software 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val=32 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val=32 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val=1 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val=No 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val= 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:33.610 06:16:02 -- accel/accel.sh@21 -- # val= 00:06:33.610 06:16:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # IFS=: 00:06:33.610 06:16:02 -- accel/accel.sh@20 -- # read -r var val 00:06:34.992 06:16:04 -- accel/accel.sh@21 -- # val= 00:06:34.992 06:16:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.992 06:16:04 -- accel/accel.sh@20 -- # IFS=: 00:06:34.992 06:16:04 -- accel/accel.sh@20 -- # read -r var val 00:06:34.992 06:16:04 -- accel/accel.sh@21 -- # val= 00:06:34.992 06:16:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.992 06:16:04 -- accel/accel.sh@20 -- # IFS=: 00:06:34.992 06:16:04 -- accel/accel.sh@20 -- # read -r var val 00:06:34.992 06:16:04 -- accel/accel.sh@21 -- # val= 00:06:34.992 06:16:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.992 06:16:04 -- accel/accel.sh@20 -- # 
IFS=: 00:06:34.992 06:16:04 -- accel/accel.sh@20 -- # read -r var val 00:06:34.992 06:16:04 -- accel/accel.sh@21 -- # val= 00:06:34.992 06:16:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.992 06:16:04 -- accel/accel.sh@20 -- # IFS=: 00:06:34.992 06:16:04 -- accel/accel.sh@20 -- # read -r var val 00:06:34.992 06:16:04 -- accel/accel.sh@21 -- # val= 00:06:34.992 06:16:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.992 06:16:04 -- accel/accel.sh@20 -- # IFS=: 00:06:34.992 06:16:04 -- accel/accel.sh@20 -- # read -r var val 00:06:34.992 06:16:04 -- accel/accel.sh@21 -- # val= 00:06:34.992 06:16:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.992 06:16:04 -- accel/accel.sh@20 -- # IFS=: 00:06:34.992 06:16:04 -- accel/accel.sh@20 -- # read -r var val 00:06:34.992 06:16:04 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:34.992 06:16:04 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:06:34.992 06:16:04 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:34.992 00:06:34.992 real 0m2.664s 00:06:34.992 user 0m2.412s 00:06:34.992 sys 0m0.251s 00:06:34.992 06:16:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:34.992 06:16:04 -- common/autotest_common.sh@10 -- # set +x 00:06:34.992 ************************************ 00:06:34.992 END TEST accel_comp 00:06:34.992 ************************************ 00:06:34.992 06:16:04 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:34.992 06:16:04 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:34.992 06:16:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:34.992 06:16:04 -- common/autotest_common.sh@10 -- # set +x 00:06:34.992 ************************************ 00:06:34.992 START TEST accel_decomp 00:06:34.992 ************************************ 00:06:34.992 06:16:04 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:34.992 06:16:04 -- accel/accel.sh@16 -- # local accel_opc 00:06:34.992 06:16:04 -- accel/accel.sh@17 -- # local accel_module 00:06:34.992 06:16:04 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:34.992 06:16:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:34.992 06:16:04 -- accel/accel.sh@12 -- # build_accel_config 00:06:34.992 06:16:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:34.992 06:16:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:34.992 06:16:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:34.992 06:16:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:34.992 06:16:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:34.992 06:16:04 -- accel/accel.sh@41 -- # local IFS=, 00:06:34.992 06:16:04 -- accel/accel.sh@42 -- # jq -r . 00:06:34.992 [2024-11-27 06:16:04.153400] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:34.992 [2024-11-27 06:16:04.153493] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid24603 ] 00:06:34.992 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.992 [2024-11-27 06:16:04.222226] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.992 [2024-11-27 06:16:04.290588] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.930 06:16:05 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:35.930 00:06:35.930 SPDK Configuration: 00:06:35.930 Core mask: 0x1 00:06:35.930 00:06:35.930 Accel Perf Configuration: 00:06:35.930 Workload Type: decompress 00:06:35.930 Transfer size: 4096 bytes 00:06:35.930 Vector count 1 00:06:35.930 Module: software 00:06:35.930 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:35.930 Queue depth: 32 00:06:35.930 Allocate depth: 32 00:06:35.930 # threads/core: 1 00:06:35.930 Run time: 1 seconds 00:06:35.930 Verify: Yes 00:06:35.930 00:06:35.930 Running for 1 seconds... 00:06:35.930 00:06:35.930 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:35.930 ------------------------------------------------------------------------------------ 00:06:35.930 0,0 92544/s 170 MiB/s 0 0 00:06:35.930 ==================================================================================== 00:06:35.930 Total 92544/s 361 MiB/s 0 0' 00:06:35.930 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:35.930 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:35.930 06:16:05 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:35.930 06:16:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:35.930 06:16:05 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.930 06:16:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:35.930 06:16:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.930 06:16:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.930 06:16:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.190 06:16:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.190 06:16:05 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.190 06:16:05 -- accel/accel.sh@42 -- # jq -r . 00:06:36.190 [2024-11-27 06:16:05.481780] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:36.190 [2024-11-27 06:16:05.481876] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid24849 ] 00:06:36.190 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.190 [2024-11-27 06:16:05.552928] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.190 [2024-11-27 06:16:05.619807] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.190 06:16:05 -- accel/accel.sh@21 -- # val= 00:06:36.190 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.190 06:16:05 -- accel/accel.sh@21 -- # val= 00:06:36.190 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.190 06:16:05 -- accel/accel.sh@21 -- # val= 00:06:36.190 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.190 06:16:05 -- accel/accel.sh@21 -- # val=0x1 00:06:36.190 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.190 06:16:05 -- accel/accel.sh@21 -- # val= 00:06:36.190 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.190 06:16:05 -- accel/accel.sh@21 -- # val= 00:06:36.190 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.190 06:16:05 -- accel/accel.sh@21 -- # val=decompress 00:06:36.190 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.190 06:16:05 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.190 06:16:05 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:36.190 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.190 06:16:05 -- accel/accel.sh@21 -- # val= 00:06:36.190 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.190 06:16:05 -- accel/accel.sh@21 -- # val=software 00:06:36.190 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.190 06:16:05 -- accel/accel.sh@23 -- # accel_module=software 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.190 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.190 06:16:05 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:36.191 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.191 06:16:05 -- accel/accel.sh@21 -- # val=32 00:06:36.191 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.191 06:16:05 
-- accel/accel.sh@20 -- # read -r var val 00:06:36.191 06:16:05 -- accel/accel.sh@21 -- # val=32 00:06:36.191 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.191 06:16:05 -- accel/accel.sh@21 -- # val=1 00:06:36.191 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.191 06:16:05 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:36.191 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.191 06:16:05 -- accel/accel.sh@21 -- # val=Yes 00:06:36.191 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.191 06:16:05 -- accel/accel.sh@21 -- # val= 00:06:36.191 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:36.191 06:16:05 -- accel/accel.sh@21 -- # val= 00:06:36.191 06:16:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # IFS=: 00:06:36.191 06:16:05 -- accel/accel.sh@20 -- # read -r var val 00:06:37.571 06:16:06 -- accel/accel.sh@21 -- # val= 00:06:37.571 06:16:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.571 06:16:06 -- accel/accel.sh@20 -- # IFS=: 00:06:37.571 06:16:06 -- accel/accel.sh@20 -- # read -r var val 00:06:37.571 06:16:06 -- accel/accel.sh@21 -- # val= 00:06:37.571 06:16:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.571 06:16:06 -- accel/accel.sh@20 -- # IFS=: 00:06:37.571 06:16:06 -- accel/accel.sh@20 -- # read -r var val 00:06:37.571 06:16:06 -- accel/accel.sh@21 -- # val= 00:06:37.571 06:16:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.571 06:16:06 -- accel/accel.sh@20 -- # IFS=: 00:06:37.571 06:16:06 -- accel/accel.sh@20 -- # read -r var val 00:06:37.571 06:16:06 -- accel/accel.sh@21 -- # val= 00:06:37.571 06:16:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.571 06:16:06 -- accel/accel.sh@20 -- # IFS=: 00:06:37.571 06:16:06 -- accel/accel.sh@20 -- # read -r var val 00:06:37.571 06:16:06 -- accel/accel.sh@21 -- # val= 00:06:37.571 06:16:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.571 06:16:06 -- accel/accel.sh@20 -- # IFS=: 00:06:37.571 06:16:06 -- accel/accel.sh@20 -- # read -r var val 00:06:37.571 06:16:06 -- accel/accel.sh@21 -- # val= 00:06:37.571 06:16:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.571 06:16:06 -- accel/accel.sh@20 -- # IFS=: 00:06:37.571 06:16:06 -- accel/accel.sh@20 -- # read -r var val 00:06:37.571 06:16:06 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:37.572 06:16:06 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:37.572 06:16:06 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:37.572 00:06:37.572 real 0m2.659s 00:06:37.572 user 0m2.392s 00:06:37.572 sys 0m0.265s 00:06:37.572 06:16:06 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:37.572 06:16:06 -- common/autotest_common.sh@10 -- # set +x 00:06:37.572 ************************************ 00:06:37.572 END TEST accel_decomp 00:06:37.572 ************************************ 00:06:37.572 06:16:06 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 -w 
decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:37.572 06:16:06 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:37.572 06:16:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:37.572 06:16:06 -- common/autotest_common.sh@10 -- # set +x 00:06:37.572 ************************************ 00:06:37.572 START TEST accel_decmop_full 00:06:37.572 ************************************ 00:06:37.572 06:16:06 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:37.572 06:16:06 -- accel/accel.sh@16 -- # local accel_opc 00:06:37.572 06:16:06 -- accel/accel.sh@17 -- # local accel_module 00:06:37.572 06:16:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:37.572 06:16:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:37.572 06:16:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.572 06:16:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.572 06:16:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.572 06:16:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.572 06:16:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.572 06:16:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.572 06:16:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.572 06:16:06 -- accel/accel.sh@42 -- # jq -r . 00:06:37.572 [2024-11-27 06:16:06.856835] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:37.572 [2024-11-27 06:16:06.856927] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid25136 ] 00:06:37.572 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.572 [2024-11-27 06:16:06.927310] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.572 [2024-11-27 06:16:06.995659] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.951 06:16:08 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:38.951 00:06:38.951 SPDK Configuration: 00:06:38.951 Core mask: 0x1 00:06:38.951 00:06:38.951 Accel Perf Configuration: 00:06:38.951 Workload Type: decompress 00:06:38.951 Transfer size: 111250 bytes 00:06:38.951 Vector count 1 00:06:38.951 Module: software 00:06:38.951 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:38.951 Queue depth: 32 00:06:38.951 Allocate depth: 32 00:06:38.951 # threads/core: 1 00:06:38.951 Run time: 1 seconds 00:06:38.951 Verify: Yes 00:06:38.951 00:06:38.951 Running for 1 seconds... 
00:06:38.951 00:06:38.951 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:38.951 ------------------------------------------------------------------------------------ 00:06:38.951 0,0 5856/s 621 MiB/s 0 0 00:06:38.951 ==================================================================================== 00:06:38.951 Total 5856/s 621 MiB/s 0 0' 00:06:38.951 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.951 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.951 06:16:08 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:38.951 06:16:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:38.951 06:16:08 -- accel/accel.sh@12 -- # build_accel_config 00:06:38.951 06:16:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:38.951 06:16:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.951 06:16:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.951 06:16:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:38.951 06:16:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:38.951 06:16:08 -- accel/accel.sh@41 -- # local IFS=, 00:06:38.951 06:16:08 -- accel/accel.sh@42 -- # jq -r . 00:06:38.952 [2024-11-27 06:16:08.194091] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... [2024-11-27 06:16:08.194180] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid25408 ] 00:06:38.952 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.952 [2024-11-27 06:16:08.262455] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.952 [2024-11-27 06:16:08.329291] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val= 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val= 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val= 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val=0x1 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val= 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val= 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val=decompress 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case
"$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val= 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val=software 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@23 -- # accel_module=software 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val=32 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val=32 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val=1 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val=Yes 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val= 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:38.952 06:16:08 -- accel/accel.sh@21 -- # val= 00:06:38.952 06:16:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # IFS=: 00:06:38.952 06:16:08 -- accel/accel.sh@20 -- # read -r var val 00:06:40.332 06:16:09 -- accel/accel.sh@21 -- # val= 00:06:40.332 06:16:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.332 06:16:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.332 06:16:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.332 06:16:09 -- accel/accel.sh@21 -- # val= 00:06:40.332 06:16:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.332 06:16:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.332 06:16:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.332 06:16:09 -- accel/accel.sh@21 -- # val= 00:06:40.332 06:16:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.332 06:16:09 
-- accel/accel.sh@20 -- # IFS=: 00:06:40.332 06:16:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.332 06:16:09 -- accel/accel.sh@21 -- # val= 00:06:40.332 06:16:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.332 06:16:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.332 06:16:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.332 06:16:09 -- accel/accel.sh@21 -- # val= 00:06:40.332 06:16:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.332 06:16:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.332 06:16:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.332 06:16:09 -- accel/accel.sh@21 -- # val= 00:06:40.332 06:16:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.332 06:16:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.332 06:16:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.332 06:16:09 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:40.332 06:16:09 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:40.332 06:16:09 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:40.332 00:06:40.332 real 0m2.672s 00:06:40.332 user 0m2.411s 00:06:40.332 sys 0m0.256s 00:06:40.332 06:16:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:40.332 06:16:09 -- common/autotest_common.sh@10 -- # set +x 00:06:40.332 ************************************ 00:06:40.332 END TEST accel_decmop_full 00:06:40.332 ************************************ 00:06:40.332 06:16:09 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:40.332 06:16:09 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:40.332 06:16:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:40.332 06:16:09 -- common/autotest_common.sh@10 -- # set +x 00:06:40.332 ************************************ 00:06:40.332 START TEST accel_decomp_mcore 00:06:40.332 ************************************ 00:06:40.332 06:16:09 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:40.332 06:16:09 -- accel/accel.sh@16 -- # local accel_opc 00:06:40.332 06:16:09 -- accel/accel.sh@17 -- # local accel_module 00:06:40.332 06:16:09 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:40.332 06:16:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:40.332 06:16:09 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.332 06:16:09 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:40.332 06:16:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.332 06:16:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.332 06:16:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:40.332 06:16:09 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:40.332 06:16:09 -- accel/accel.sh@41 -- # local IFS=, 00:06:40.332 06:16:09 -- accel/accel.sh@42 -- # jq -r . 00:06:40.332 [2024-11-27 06:16:09.571211] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
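A note on accel_decomp_mcore, which starts here: the only change from the plain accel_decomp case is -m 0xf, a core mask with bits 0 through 3 set, so the EAL output that follows reports "Total cores available: 4" and one "Reactor started on core N" notice per bit. A minimal sketch of the same invocation (flag meanings inferred from the trace, not authoritative):

  SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  # 0xf == 0b1111, so one reactor runs on each of cores 0, 1, 2 and 3
  "$SPDK_DIR/build/examples/accel_perf" \
    -t 1 -w decompress -l "$SPDK_DIR/test/accel/bib" -y -m 0xf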
00:06:40.332 [2024-11-27 06:16:09.571286] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid25689 ] 00:06:40.332 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.332 [2024-11-27 06:16:09.639453] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:40.332 [2024-11-27 06:16:09.709269] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:40.332 [2024-11-27 06:16:09.709366] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:40.332 [2024-11-27 06:16:09.709453] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:40.332 [2024-11-27 06:16:09.709455] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.712 06:16:10 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:41.712 00:06:41.712 SPDK Configuration: 00:06:41.712 Core mask: 0xf 00:06:41.712 00:06:41.712 Accel Perf Configuration: 00:06:41.712 Workload Type: decompress 00:06:41.712 Transfer size: 4096 bytes 00:06:41.712 Vector count 1 00:06:41.712 Module: software 00:06:41.712 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:41.712 Queue depth: 32 00:06:41.712 Allocate depth: 32 00:06:41.712 # threads/core: 1 00:06:41.712 Run time: 1 seconds 00:06:41.712 Verify: Yes 00:06:41.712 00:06:41.712 Running for 1 seconds... 00:06:41.712 00:06:41.712 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:41.712 ------------------------------------------------------------------------------------ 00:06:41.712 0,0 76000/s 296 MiB/s 0 0 00:06:41.712 3,0 76640/s 299 MiB/s 0 0 00:06:41.712 2,0 76288/s 298 MiB/s 0 0 00:06:41.712 1,0 76352/s 298 MiB/s 0 0 00:06:41.712 ==================================================================================== 00:06:41.712 Total 305280/s 1192 MiB/s 0 0' 00:06:41.712 06:16:10 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:10 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:41.712 06:16:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:41.712 06:16:10 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.712 06:16:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.712 06:16:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.712 06:16:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.712 06:16:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.712 06:16:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.712 06:16:10 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.712 06:16:10 -- accel/accel.sh@42 -- # jq -r . 00:06:41.712 [2024-11-27 06:16:10.909883] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
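How to read the Core,Thread table above: Transfers is completed operations per second for one reactor/thread, Bandwidth is that rate times the configured transfer size, and the per-row values sum to the Total row (up to MiB rounding). A quick consistency check in shell arithmetic, with the numbers taken from the table and 1 MiB = 1048576 bytes:

  echo $(( 76000 * 4096 / 1048576 ))    # one core:   296 MiB/s
  echo $(( 305280 * 4096 / 1048576 ))   # all cores: 1192 MiB/s

The four per-core rates are within about one percent of each other, so this 4096-byte workload spreads evenly across the mask.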
00:06:41.712 [2024-11-27 06:16:10.909974] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid25970 ] 00:06:41.712 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.712 [2024-11-27 06:16:10.978799] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:41.712 [2024-11-27 06:16:11.047522] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:41.712 [2024-11-27 06:16:11.047627] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:41.712 [2024-11-27 06:16:11.047701] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:41.712 [2024-11-27 06:16:11.047703] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val= 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val= 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val= 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val=0xf 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val= 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val= 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val=decompress 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val= 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val=software 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@23 -- # accel_module=software 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val=32 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val=32 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.712 06:16:11 -- accel/accel.sh@21 -- # val=1 00:06:41.712 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.712 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.713 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.713 06:16:11 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:41.713 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.713 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.713 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.713 06:16:11 -- accel/accel.sh@21 -- # val=Yes 00:06:41.713 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.713 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.713 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.713 06:16:11 -- accel/accel.sh@21 -- # val= 00:06:41.713 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.713 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.713 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:41.713 06:16:11 -- accel/accel.sh@21 -- # val= 00:06:41.713 06:16:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.713 06:16:11 -- accel/accel.sh@20 -- # IFS=: 00:06:41.713 06:16:11 -- accel/accel.sh@20 -- # read -r var val 00:06:42.741 06:16:12 -- accel/accel.sh@21 -- # val= 00:06:42.741 06:16:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.741 06:16:12 -- accel/accel.sh@21 -- # val= 00:06:42.741 06:16:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.741 06:16:12 -- accel/accel.sh@21 -- # val= 00:06:42.741 06:16:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.741 06:16:12 -- accel/accel.sh@21 -- # val= 00:06:42.741 06:16:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.741 06:16:12 -- accel/accel.sh@21 -- # val= 00:06:42.741 06:16:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.741 06:16:12 -- accel/accel.sh@21 -- # val= 00:06:42.741 06:16:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.741 06:16:12 -- accel/accel.sh@21 -- # val= 00:06:42.741 06:16:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.741 06:16:12 -- accel/accel.sh@21 -- # val= 00:06:42.741 06:16:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.741 
06:16:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.741 06:16:12 -- accel/accel.sh@21 -- # val= 00:06:42.741 06:16:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # IFS=: 00:06:42.741 06:16:12 -- accel/accel.sh@20 -- # read -r var val 00:06:42.741 06:16:12 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:42.741 06:16:12 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:42.741 06:16:12 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:42.741 00:06:42.741 real 0m2.684s 00:06:42.741 user 0m9.073s 00:06:42.741 sys 0m0.275s 00:06:42.741 06:16:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:42.741 06:16:12 -- common/autotest_common.sh@10 -- # set +x 00:06:42.741 ************************************ 00:06:42.741 END TEST accel_decomp_mcore 00:06:42.741 ************************************ 00:06:43.000 06:16:12 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:43.001 06:16:12 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:43.001 06:16:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:43.001 06:16:12 -- common/autotest_common.sh@10 -- # set +x 00:06:43.001 ************************************ 00:06:43.001 START TEST accel_decomp_full_mcore 00:06:43.001 ************************************ 00:06:43.001 06:16:12 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:43.001 06:16:12 -- accel/accel.sh@16 -- # local accel_opc 00:06:43.001 06:16:12 -- accel/accel.sh@17 -- # local accel_module 00:06:43.001 06:16:12 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:43.001 06:16:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:43.001 06:16:12 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.001 06:16:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.001 06:16:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.001 06:16:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.001 06:16:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.001 06:16:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.001 06:16:12 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.001 06:16:12 -- accel/accel.sh@42 -- # jq -r . 00:06:43.001 [2024-11-27 06:16:12.305396] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
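A note on accel_decomp_full_mcore, starting here: it combines the two variations seen above, full-chunk transfers (-o 0, reported as 111250 bytes) and the 0xf core mask, so the table that follows should show four rows of large transfers. Sketch of the equivalent invocation, with the same hedges as the earlier sketches:

  SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  "$SPDK_DIR/build/examples/accel_perf" \
    -t 1 -w decompress -l "$SPDK_DIR/test/accel/bib" -y -o 0 -m 0xf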
00:06:43.001 [2024-11-27 06:16:12.305489] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid26230 ] 00:06:43.001 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.001 [2024-11-27 06:16:12.375911] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:43.001 [2024-11-27 06:16:12.447564] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:43.001 [2024-11-27 06:16:12.447645] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:43.001 [2024-11-27 06:16:12.447690] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:43.001 [2024-11-27 06:16:12.447692] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.379 06:16:13 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:44.379 00:06:44.379 SPDK Configuration: 00:06:44.379 Core mask: 0xf 00:06:44.379 00:06:44.379 Accel Perf Configuration: 00:06:44.379 Workload Type: decompress 00:06:44.379 Transfer size: 111250 bytes 00:06:44.379 Vector count 1 00:06:44.380 Module: software 00:06:44.380 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:44.380 Queue depth: 32 00:06:44.380 Allocate depth: 32 00:06:44.380 # threads/core: 1 00:06:44.380 Run time: 1 seconds 00:06:44.380 Verify: Yes 00:06:44.380 00:06:44.380 Running for 1 seconds... 00:06:44.380 00:06:44.380 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:44.380 ------------------------------------------------------------------------------------ 00:06:44.380 0,0 5792/s 614 MiB/s 0 0 00:06:44.380 3,0 5824/s 617 MiB/s 0 0 00:06:44.380 2,0 5824/s 617 MiB/s 0 0 00:06:44.380 1,0 5824/s 617 MiB/s 0 0 00:06:44.380 ==================================================================================== 00:06:44.380 Total 23264/s 2468 MiB/s 0 0' 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:44.380 06:16:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:44.380 06:16:13 -- accel/accel.sh@12 -- # build_accel_config 00:06:44.380 06:16:13 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:44.380 06:16:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.380 06:16:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.380 06:16:13 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:44.380 06:16:13 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:44.380 06:16:13 -- accel/accel.sh@41 -- # local IFS=, 00:06:44.380 06:16:13 -- accel/accel.sh@42 -- # jq -r . 00:06:44.380 [2024-11-27 06:16:13.655432] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
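A note on the recurring "-c /dev/fd/62" argument: build_accel_config assembles a JSON accel configuration in the accel_json_cfg array (empty in this log, since every [[ 0 -gt 0 ]] feature check above evaluated false), filters it through jq -r ., and hands it to accel_perf on an inherited file descriptor rather than a temporary file. In bash, process substitution reproduces that mechanism; the JSON body below is a hypothetical stand-in, since the log never prints the config that accel.sh actually built:

  # <(...) expands to a /dev/fd/NN path, matching -c /dev/fd/62 in the trace
  SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  build_cfg() { printf '{"subsystems": []}'; }   # hypothetical stand-in config
  "$SPDK_DIR/build/examples/accel_perf" -c <(build_cfg) \
    -t 1 -w decompress -l "$SPDK_DIR/test/accel/bib" -y -o 0 -m 0xf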
00:06:44.380 [2024-11-27 06:16:13.655531] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid26405 ] 00:06:44.380 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.380 [2024-11-27 06:16:13.723887] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:44.380 [2024-11-27 06:16:13.793782] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:44.380 [2024-11-27 06:16:13.793880] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:44.380 [2024-11-27 06:16:13.793962] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:44.380 [2024-11-27 06:16:13.793964] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val= 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val= 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val= 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val=0xf 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val= 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val= 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val=decompress 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val= 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val=software 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@23 -- # accel_module=software 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val=32 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val=32 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val=1 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val=Yes 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val= 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.380 06:16:13 -- accel/accel.sh@21 -- # val= 00:06:44.380 06:16:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.380 06:16:13 -- accel/accel.sh@20 -- # read -r var val 00:06:45.761 06:16:14 -- accel/accel.sh@21 -- # val= 00:06:45.761 06:16:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # read -r var val 00:06:45.761 06:16:14 -- accel/accel.sh@21 -- # val= 00:06:45.761 06:16:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # read -r var val 00:06:45.761 06:16:14 -- accel/accel.sh@21 -- # val= 00:06:45.761 06:16:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # read -r var val 00:06:45.761 06:16:14 -- accel/accel.sh@21 -- # val= 00:06:45.761 06:16:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # read -r var val 00:06:45.761 06:16:14 -- accel/accel.sh@21 -- # val= 00:06:45.761 06:16:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # read -r var val 00:06:45.761 06:16:14 -- accel/accel.sh@21 -- # val= 00:06:45.761 06:16:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # read -r var val 00:06:45.761 06:16:14 -- accel/accel.sh@21 -- # val= 00:06:45.761 06:16:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.761 06:16:14 -- accel/accel.sh@20 -- # read -r var val 00:06:45.761 06:16:14 -- accel/accel.sh@21 -- # val= 00:06:45.762 06:16:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.762 
06:16:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.762 06:16:14 -- accel/accel.sh@20 -- # read -r var val 00:06:45.762 06:16:14 -- accel/accel.sh@21 -- # val= 00:06:45.762 06:16:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.762 06:16:14 -- accel/accel.sh@20 -- # IFS=: 00:06:45.762 06:16:14 -- accel/accel.sh@20 -- # read -r var val 00:06:45.762 06:16:14 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:45.762 06:16:14 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:45.762 06:16:14 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:45.762 00:06:45.762 real 0m2.707s 00:06:45.762 user 0m9.141s 00:06:45.762 sys 0m0.273s 00:06:45.762 06:16:14 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:45.762 06:16:14 -- common/autotest_common.sh@10 -- # set +x 00:06:45.762 ************************************ 00:06:45.762 END TEST accel_decomp_full_mcore 00:06:45.762 ************************************ 00:06:45.762 06:16:15 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:45.762 06:16:15 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:45.762 06:16:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:45.762 06:16:15 -- common/autotest_common.sh@10 -- # set +x 00:06:45.762 ************************************ 00:06:45.762 START TEST accel_decomp_mthread 00:06:45.762 ************************************ 00:06:45.762 06:16:15 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:45.762 06:16:15 -- accel/accel.sh@16 -- # local accel_opc 00:06:45.762 06:16:15 -- accel/accel.sh@17 -- # local accel_module 00:06:45.762 06:16:15 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:45.762 06:16:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:45.762 06:16:15 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.762 06:16:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.762 06:16:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.762 06:16:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.762 06:16:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.762 06:16:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.762 06:16:15 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.762 06:16:15 -- accel/accel.sh@42 -- # jq -r . 00:06:45.762 [2024-11-27 06:16:15.061138] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:45.762 [2024-11-27 06:16:15.061216] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid26618 ] 00:06:45.762 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.762 [2024-11-27 06:16:15.130102] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.762 [2024-11-27 06:16:15.198834] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.143 06:16:16 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:06:47.143 00:06:47.143 SPDK Configuration: 00:06:47.143 Core mask: 0x1 00:06:47.143 00:06:47.143 Accel Perf Configuration: 00:06:47.144 Workload Type: decompress 00:06:47.144 Transfer size: 4096 bytes 00:06:47.144 Vector count 1 00:06:47.144 Module: software 00:06:47.144 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:47.144 Queue depth: 32 00:06:47.144 Allocate depth: 32 00:06:47.144 # threads/core: 2 00:06:47.144 Run time: 1 seconds 00:06:47.144 Verify: Yes 00:06:47.144 00:06:47.144 Running for 1 seconds... 00:06:47.144 00:06:47.144 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:47.144 ------------------------------------------------------------------------------------ 00:06:47.144 0,1 45312/s 177 MiB/s 0 0 00:06:47.144 0,0 45184/s 176 MiB/s 0 0 00:06:47.144 ==================================================================================== 00:06:47.144 Total 90496/s 353 MiB/s 0 0' 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:47.144 06:16:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:47.144 06:16:16 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.144 06:16:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:47.144 06:16:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.144 06:16:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.144 06:16:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:47.144 06:16:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:47.144 06:16:16 -- accel/accel.sh@41 -- # local IFS=, 00:06:47.144 06:16:16 -- accel/accel.sh@42 -- # jq -r . 00:06:47.144 [2024-11-27 06:16:16.392692] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
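A note on accel_decomp_mthread above: here the core mask stays 0x1, but -T 2 asks for two worker threads on that core, which is why the configuration dump reports "# threads/core: 2" and the table has rows 0,0 and 0,1. The same bandwidth arithmetic applies per thread:

  echo $(( 45312 * 4096 / 1048576 ))   # thread 0,1:   177 MiB/s
  echo $(( 90496 * 4096 / 1048576 ))   # both threads: 353 MiB/s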
00:06:47.144 [2024-11-27 06:16:16.392782] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid26842 ] 00:06:47.144 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.144 [2024-11-27 06:16:16.461377] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.144 [2024-11-27 06:16:16.529526] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val= 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val= 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val= 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val=0x1 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val= 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val= 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val=decompress 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val= 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val=software 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@23 -- # accel_module=software 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val=32 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 
-- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val=32 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val=2 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val=Yes 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val= 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.144 06:16:16 -- accel/accel.sh@21 -- # val= 00:06:47.144 06:16:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.144 06:16:16 -- accel/accel.sh@20 -- # read -r var val 00:06:48.525 06:16:17 -- accel/accel.sh@21 -- # val= 00:06:48.525 06:16:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # IFS=: 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # read -r var val 00:06:48.525 06:16:17 -- accel/accel.sh@21 -- # val= 00:06:48.525 06:16:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # IFS=: 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # read -r var val 00:06:48.525 06:16:17 -- accel/accel.sh@21 -- # val= 00:06:48.525 06:16:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # IFS=: 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # read -r var val 00:06:48.525 06:16:17 -- accel/accel.sh@21 -- # val= 00:06:48.525 06:16:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # IFS=: 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # read -r var val 00:06:48.525 06:16:17 -- accel/accel.sh@21 -- # val= 00:06:48.525 06:16:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # IFS=: 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # read -r var val 00:06:48.525 06:16:17 -- accel/accel.sh@21 -- # val= 00:06:48.525 06:16:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # IFS=: 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # read -r var val 00:06:48.525 06:16:17 -- accel/accel.sh@21 -- # val= 00:06:48.525 06:16:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # IFS=: 00:06:48.525 06:16:17 -- accel/accel.sh@20 -- # read -r var val 00:06:48.525 06:16:17 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:48.525 06:16:17 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:48.525 06:16:17 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:48.525 00:06:48.525 real 0m2.671s 00:06:48.525 user 0m2.427s 00:06:48.525 sys 0m0.255s 00:06:48.525 06:16:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:48.525 06:16:17 -- common/autotest_common.sh@10 -- # set +x 
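The real/user/sys trio just above is produced by run_test from autotest_common.sh, which brackets every case with bash's time builtin and the START/END banners, disabling xtrace around the summary (the xtrace_disable and set +x records). Reduced to the behavior observable in this log, the wrapper looks roughly like this; a sketch, not the actual function body:

  run_test() {
    local name=$1; shift
    echo "************************************"
    echo "START TEST $name"
    echo "************************************"
    time "$@"    # yields the real/user/sys lines seen above
    echo "************************************"
    echo "END TEST $name"
    echo "************************************"
  }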
00:06:48.525 ************************************ 00:06:48.525 END TEST accel_decomp_mthread 00:06:48.525 ************************************ 00:06:48.525 06:16:17 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:48.525 06:16:17 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:48.525 06:16:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:48.525 06:16:17 -- common/autotest_common.sh@10 -- # set +x 00:06:48.525 ************************************ 00:06:48.525 START TEST accel_deomp_full_mthread 00:06:48.525 ************************************ 00:06:48.525 06:16:17 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:48.525 06:16:17 -- accel/accel.sh@16 -- # local accel_opc 00:06:48.525 06:16:17 -- accel/accel.sh@17 -- # local accel_module 00:06:48.525 06:16:17 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:48.525 06:16:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:48.525 06:16:17 -- accel/accel.sh@12 -- # build_accel_config 00:06:48.525 06:16:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:48.525 06:16:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.525 06:16:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.525 06:16:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:48.525 06:16:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:48.525 06:16:17 -- accel/accel.sh@41 -- # local IFS=, 00:06:48.525 06:16:17 -- accel/accel.sh@42 -- # jq -r . 00:06:48.525 [2024-11-27 06:16:17.781078] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:48.525 [2024-11-27 06:16:17.781153] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid27129 ] 00:06:48.525 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.525 [2024-11-27 06:16:17.849248] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.525 [2024-11-27 06:16:17.917006] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.903 06:16:19 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:49.903 00:06:49.903 SPDK Configuration: 00:06:49.903 Core mask: 0x1 00:06:49.903 00:06:49.903 Accel Perf Configuration: 00:06:49.903 Workload Type: decompress 00:06:49.903 Transfer size: 111250 bytes 00:06:49.903 Vector count 1 00:06:49.903 Module: software 00:06:49.903 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:49.903 Queue depth: 32 00:06:49.903 Allocate depth: 32 00:06:49.903 # threads/core: 2 00:06:49.903 Run time: 1 seconds 00:06:49.903 Verify: Yes 00:06:49.903 00:06:49.903 Running for 1 seconds... 
00:06:49.903 00:06:49.903 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:49.903 ------------------------------------------------------------------------------------ 00:06:49.903 0,1 2944/s 312 MiB/s 0 0 00:06:49.903 0,0 2944/s 312 MiB/s 0 0 00:06:49.903 ==================================================================================== 00:06:49.903 Total 5888/s 624 MiB/s 0 0' 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:49.903 06:16:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:49.903 06:16:19 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.903 06:16:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.903 06:16:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.903 06:16:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.903 06:16:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.903 06:16:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.903 06:16:19 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.903 06:16:19 -- accel/accel.sh@42 -- # jq -r . 00:06:49.903 [2024-11-27 06:16:19.130043] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... [2024-11-27 06:16:19.130133] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid27402 ] 00:06:49.903 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.903 [2024-11-27 06:16:19.199724] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.903 [2024-11-27 06:16:19.265502] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val= 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val= 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val= 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val=0x1 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val= 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val= 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val=decompress
00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val= 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val=software 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@23 -- # accel_module=software 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val=32 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val=32 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val=2 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val=Yes 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val= 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.903 06:16:19 -- accel/accel.sh@21 -- # val= 00:06:49.903 06:16:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # IFS=: 00:06:49.903 06:16:19 -- accel/accel.sh@20 -- # read -r var val 00:06:51.281 06:16:20 -- accel/accel.sh@21 -- # val= 00:06:51.281 06:16:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # IFS=: 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # read -r var val 00:06:51.281 06:16:20 -- accel/accel.sh@21 -- # val= 00:06:51.281 06:16:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # IFS=: 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # read -r var val 00:06:51.281 06:16:20 -- accel/accel.sh@21 -- # val= 00:06:51.281 06:16:20 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # IFS=: 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # read -r var val 00:06:51.281 06:16:20 -- accel/accel.sh@21 -- # val= 00:06:51.281 06:16:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # IFS=: 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # read -r var val 00:06:51.281 06:16:20 -- accel/accel.sh@21 -- # val= 00:06:51.281 06:16:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # IFS=: 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # read -r var val 00:06:51.281 06:16:20 -- accel/accel.sh@21 -- # val= 00:06:51.281 06:16:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # IFS=: 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # read -r var val 00:06:51.281 06:16:20 -- accel/accel.sh@21 -- # val= 00:06:51.281 06:16:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # IFS=: 00:06:51.281 06:16:20 -- accel/accel.sh@20 -- # read -r var val 00:06:51.281 06:16:20 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:51.281 06:16:20 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:51.281 06:16:20 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:51.281 00:06:51.281 real 0m2.707s 00:06:51.281 user 0m2.463s 00:06:51.281 sys 0m0.249s 00:06:51.281 06:16:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:51.281 06:16:20 -- common/autotest_common.sh@10 -- # set +x 00:06:51.281 ************************************ 00:06:51.281 END TEST accel_deomp_full_mthread 00:06:51.281 ************************************ 00:06:51.281 06:16:20 -- accel/accel.sh@116 -- # [[ n == y ]] 00:06:51.281 06:16:20 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:51.281 06:16:20 -- accel/accel.sh@129 -- # build_accel_config 00:06:51.281 06:16:20 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:06:51.281 06:16:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:51.281 06:16:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:51.281 06:16:20 -- common/autotest_common.sh@10 -- # set +x 00:06:51.281 06:16:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.281 06:16:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.281 06:16:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:51.281 06:16:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:51.281 06:16:20 -- accel/accel.sh@41 -- # local IFS=, 00:06:51.281 06:16:20 -- accel/accel.sh@42 -- # jq -r . 00:06:51.281 ************************************ 00:06:51.281 START TEST accel_dif_functional_tests 00:06:51.281 ************************************ 00:06:51.281 06:16:20 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:51.282 [2024-11-27 06:16:20.539936] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
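A note on accel_dif_functional_tests, starting here: unlike the perf cases, this drives a CUnit suite (test/accel/dif/dif), fed its accel config over the same /dev/fd mechanism seen in the run_test line above. The dif.c *ERROR* lines that follow are expected output: each "verify: DIF not generated, ... check" case deliberately corrupts a Guard, App Tag, or Ref Tag and passes precisely because the verify operation reports the mismatch. A hand-run sketch, with the same hypothetical stand-in config as earlier:

  SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  "$SPDK_DIR/test/accel/dif/dif" -c <(printf '{"subsystems": []}')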
00:06:51.282 [2024-11-27 06:16:20.540027] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid27686 ] 00:06:51.282 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.282 [2024-11-27 06:16:20.609369] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:51.282 [2024-11-27 06:16:20.676350] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:51.282 [2024-11-27 06:16:20.676441] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.282 [2024-11-27 06:16:20.676442] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:51.282 00:06:51.282 00:06:51.282 CUnit - A unit testing framework for C - Version 2.1-3 00:06:51.282 http://cunit.sourceforge.net/ 00:06:51.282 00:06:51.282 00:06:51.282 Suite: accel_dif 00:06:51.282 Test: verify: DIF generated, GUARD check ...passed 00:06:51.282 Test: verify: DIF generated, APPTAG check ...passed 00:06:51.282 Test: verify: DIF generated, REFTAG check ...passed 00:06:51.282 Test: verify: DIF not generated, GUARD check ...[2024-11-27 06:16:20.744922] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:06:51.282 [2024-11-27 06:16:20.744974] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:06:51.282 passed 00:06:51.282 Test: verify: DIF not generated, APPTAG check ...[2024-11-27 06:16:20.745026] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:06:51.282 [2024-11-27 06:16:20.745045] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:06:51.282 passed 00:06:51.282 Test: verify: DIF not generated, REFTAG check ...[2024-11-27 06:16:20.745068] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:06:51.282 [2024-11-27 06:16:20.745092] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:06:51.282 passed 00:06:51.282 Test: verify: APPTAG correct, APPTAG check ...passed 00:06:51.282 Test: verify: APPTAG incorrect, APPTAG check ...[2024-11-27 06:16:20.745137] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:06:51.282 passed 00:06:51.282 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:06:51.282 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:06:51.282 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:06:51.282 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-11-27 06:16:20.745239] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:06:51.282 passed 00:06:51.282 Test: generate copy: DIF generated, GUARD check ...passed 00:06:51.282 Test: generate copy: DIF generated, APTTAG check ...passed 00:06:51.282 Test: generate copy: DIF generated, REFTAG check ...passed 00:06:51.282 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:06:51.282 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:06:51.282 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:06:51.282 Test: generate copy: iovecs-len validate ...[2024-11-27 06:16:20.745424] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:06:51.282 passed 00:06:51.282 Test: generate copy: buffer alignment validate ...passed 00:06:51.282 00:06:51.282 Run Summary: Type Total Ran Passed Failed Inactive 00:06:51.282 suites 1 1 n/a 0 0 00:06:51.282 tests 20 20 20 0 0 00:06:51.282 asserts 204 204 204 0 n/a 00:06:51.282 00:06:51.282 Elapsed time = 0.000 seconds 00:06:51.541 00:06:51.541 real 0m0.391s 00:06:51.541 user 0m0.590s 00:06:51.541 sys 0m0.156s 00:06:51.541 06:16:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:51.541 06:16:20 -- common/autotest_common.sh@10 -- # set +x 00:06:51.541 ************************************ 00:06:51.541 END TEST accel_dif_functional_tests 00:06:51.541 ************************************ 00:06:51.541 00:06:51.541 real 0m56.998s 00:06:51.541 user 1m4.551s 00:06:51.541 sys 0m6.994s 00:06:51.541 06:16:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:51.541 06:16:20 -- common/autotest_common.sh@10 -- # set +x 00:06:51.541 ************************************ 00:06:51.541 END TEST accel 00:06:51.541 ************************************ 00:06:51.541 06:16:20 -- spdk/autotest.sh@177 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:51.541 06:16:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:51.541 06:16:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:51.541 06:16:20 -- common/autotest_common.sh@10 -- # set +x 00:06:51.541 ************************************ 00:06:51.541 START TEST accel_rpc 00:06:51.541 ************************************ 00:06:51.541 06:16:20 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:51.800 * Looking for test storage... 00:06:51.800 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:51.800 06:16:21 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:51.800 06:16:21 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:51.800 06:16:21 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:51.800 06:16:21 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:51.800 06:16:21 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:51.800 06:16:21 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:51.800 06:16:21 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:51.800 06:16:21 -- scripts/common.sh@335 -- # IFS=.-: 00:06:51.800 06:16:21 -- scripts/common.sh@335 -- # read -ra ver1 00:06:51.800 06:16:21 -- scripts/common.sh@336 -- # IFS=.-: 00:06:51.801 06:16:21 -- scripts/common.sh@336 -- # read -ra ver2 00:06:51.801 06:16:21 -- scripts/common.sh@337 -- # local 'op=<' 00:06:51.801 06:16:21 -- scripts/common.sh@339 -- # ver1_l=2 00:06:51.801 06:16:21 -- scripts/common.sh@340 -- # ver2_l=1 00:06:51.801 06:16:21 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:51.801 06:16:21 -- scripts/common.sh@343 -- # case "$op" in 00:06:51.801 06:16:21 -- scripts/common.sh@344 -- # : 1 00:06:51.801 06:16:21 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:51.801 06:16:21 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:51.801 06:16:21 -- scripts/common.sh@364 -- # decimal 1 00:06:51.801 06:16:21 -- scripts/common.sh@352 -- # local d=1 00:06:51.801 06:16:21 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:51.801 06:16:21 -- scripts/common.sh@354 -- # echo 1 00:06:51.801 06:16:21 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:51.801 06:16:21 -- scripts/common.sh@365 -- # decimal 2 00:06:51.801 06:16:21 -- scripts/common.sh@352 -- # local d=2 00:06:51.801 06:16:21 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:51.801 06:16:21 -- scripts/common.sh@354 -- # echo 2 00:06:51.801 06:16:21 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:51.801 06:16:21 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:51.801 06:16:21 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:51.801 06:16:21 -- scripts/common.sh@367 -- # return 0 00:06:51.801 06:16:21 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:51.801 06:16:21 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:51.801 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.801 --rc genhtml_branch_coverage=1 00:06:51.801 --rc genhtml_function_coverage=1 00:06:51.801 --rc genhtml_legend=1 00:06:51.801 --rc geninfo_all_blocks=1 00:06:51.801 --rc geninfo_unexecuted_blocks=1 00:06:51.801 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.801 ' 00:06:51.801 06:16:21 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:51.801 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.801 --rc genhtml_branch_coverage=1 00:06:51.801 --rc genhtml_function_coverage=1 00:06:51.801 --rc genhtml_legend=1 00:06:51.801 --rc geninfo_all_blocks=1 00:06:51.801 --rc geninfo_unexecuted_blocks=1 00:06:51.801 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.801 ' 00:06:51.801 06:16:21 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:51.801 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.801 --rc genhtml_branch_coverage=1 00:06:51.801 --rc genhtml_function_coverage=1 00:06:51.801 --rc genhtml_legend=1 00:06:51.801 --rc geninfo_all_blocks=1 00:06:51.801 --rc geninfo_unexecuted_blocks=1 00:06:51.801 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.801 ' 00:06:51.801 06:16:21 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:51.801 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.801 --rc genhtml_branch_coverage=1 00:06:51.801 --rc genhtml_function_coverage=1 00:06:51.801 --rc genhtml_legend=1 00:06:51.801 --rc geninfo_all_blocks=1 00:06:51.801 --rc geninfo_unexecuted_blocks=1 00:06:51.801 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.801 ' 00:06:51.801 06:16:21 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:51.801 06:16:21 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=27785 00:06:51.801 06:16:21 -- accel/accel_rpc.sh@15 -- # waitforlisten 27785 00:06:51.801 06:16:21 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:06:51.801 06:16:21 -- common/autotest_common.sh@829 -- # '[' -z 27785 ']' 00:06:51.801 06:16:21 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:51.801 06:16:21 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:51.801 
06:16:21 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:51.801 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:51.801 06:16:21 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:51.801 06:16:21 -- common/autotest_common.sh@10 -- # set +x 00:06:51.801 [2024-11-27 06:16:21.211515] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:51.801 [2024-11-27 06:16:21.211584] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid27785 ] 00:06:51.801 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.801 [2024-11-27 06:16:21.279222] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.060 [2024-11-27 06:16:21.351545] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:52.060 [2024-11-27 06:16:21.351672] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.628 06:16:22 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:52.628 06:16:22 -- common/autotest_common.sh@862 -- # return 0 00:06:52.628 06:16:22 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:06:52.628 06:16:22 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:06:52.628 06:16:22 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:06:52.628 06:16:22 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:06:52.628 06:16:22 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:06:52.628 06:16:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:52.628 06:16:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:52.628 06:16:22 -- common/autotest_common.sh@10 -- # set +x 00:06:52.628 ************************************ 00:06:52.628 START TEST accel_assign_opcode 00:06:52.628 ************************************ 00:06:52.628 06:16:22 -- common/autotest_common.sh@1114 -- # accel_assign_opcode_test_suite 00:06:52.628 06:16:22 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:06:52.628 06:16:22 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.628 06:16:22 -- common/autotest_common.sh@10 -- # set +x 00:06:52.628 [2024-11-27 06:16:22.049736] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:06:52.628 06:16:22 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.628 06:16:22 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:06:52.628 06:16:22 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.628 06:16:22 -- common/autotest_common.sh@10 -- # set +x 00:06:52.628 [2024-11-27 06:16:22.057747] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:06:52.628 06:16:22 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.628 06:16:22 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:06:52.628 06:16:22 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.628 06:16:22 -- common/autotest_common.sh@10 -- # set +x 00:06:52.887 06:16:22 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.887 06:16:22 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:06:52.888 06:16:22 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:06:52.888 06:16:22 -- 
accel/accel_rpc.sh@42 -- # grep software 00:06:52.888 06:16:22 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.888 06:16:22 -- common/autotest_common.sh@10 -- # set +x 00:06:52.888 06:16:22 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.888 software 00:06:52.888 00:06:52.888 real 0m0.218s 00:06:52.888 user 0m0.030s 00:06:52.888 sys 0m0.008s 00:06:52.888 06:16:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:52.888 06:16:22 -- common/autotest_common.sh@10 -- # set +x 00:06:52.888 ************************************ 00:06:52.888 END TEST accel_assign_opcode 00:06:52.888 ************************************ 00:06:52.888 06:16:22 -- accel/accel_rpc.sh@55 -- # killprocess 27785 00:06:52.888 06:16:22 -- common/autotest_common.sh@936 -- # '[' -z 27785 ']' 00:06:52.888 06:16:22 -- common/autotest_common.sh@940 -- # kill -0 27785 00:06:52.888 06:16:22 -- common/autotest_common.sh@941 -- # uname 00:06:52.888 06:16:22 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:52.888 06:16:22 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 27785 00:06:52.888 06:16:22 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:52.888 06:16:22 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:52.888 06:16:22 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 27785' 00:06:52.888 killing process with pid 27785 00:06:52.888 06:16:22 -- common/autotest_common.sh@955 -- # kill 27785 00:06:52.888 06:16:22 -- common/autotest_common.sh@960 -- # wait 27785 00:06:53.147 00:06:53.147 real 0m1.672s 00:06:53.147 user 0m1.696s 00:06:53.147 sys 0m0.481s 00:06:53.147 06:16:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:53.147 06:16:22 -- common/autotest_common.sh@10 -- # set +x 00:06:53.147 ************************************ 00:06:53.147 END TEST accel_rpc 00:06:53.147 ************************************ 00:06:53.407 06:16:22 -- spdk/autotest.sh@178 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:06:53.407 06:16:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:53.407 06:16:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:53.407 06:16:22 -- common/autotest_common.sh@10 -- # set +x 00:06:53.407 ************************************ 00:06:53.407 START TEST app_cmdline 00:06:53.407 ************************************ 00:06:53.407 06:16:22 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:06:53.407 * Looking for test storage... 
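The accel_rpc suite that just passed reduces to a four-step RPC flow: hold the target in its pre-init state with --wait-for-rpc, assign the copy opcode to a module, finish initialization, and read the assignment back. A condensed sketch of that flow, assuming spdk_tgt and rpc.py from the build tree are on PATH:

  # 1. --wait-for-rpc keeps modules unfinalized so opcode assignments still apply.
  spdk_tgt --wait-for-rpc &
  tgt_pid=$!
  # 2. Route the 'copy' opcode to the software module (a later assignment wins,
  #    which is why the test first assigns it to the bogus module 'incorrect').
  rpc.py accel_assign_opc -o copy -m software
  # 3. Finish subsystem initialization; assignments are frozen from here on.
  rpc.py framework_start_init
  # 4. Verify: the copy opcode should now report 'software'.
  rpc.py accel_get_opc_assignments | jq -r .copy
  kill "$tgt_pid"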
00:06:53.407 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:53.407 06:16:22 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:53.407 06:16:22 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:53.407 06:16:22 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:53.407 06:16:22 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:53.407 06:16:22 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:53.407 06:16:22 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:53.407 06:16:22 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:53.407 06:16:22 -- scripts/common.sh@335 -- # IFS=.-: 00:06:53.407 06:16:22 -- scripts/common.sh@335 -- # read -ra ver1 00:06:53.407 06:16:22 -- scripts/common.sh@336 -- # IFS=.-: 00:06:53.407 06:16:22 -- scripts/common.sh@336 -- # read -ra ver2 00:06:53.407 06:16:22 -- scripts/common.sh@337 -- # local 'op=<' 00:06:53.407 06:16:22 -- scripts/common.sh@339 -- # ver1_l=2 00:06:53.407 06:16:22 -- scripts/common.sh@340 -- # ver2_l=1 00:06:53.407 06:16:22 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:53.407 06:16:22 -- scripts/common.sh@343 -- # case "$op" in 00:06:53.407 06:16:22 -- scripts/common.sh@344 -- # : 1 00:06:53.407 06:16:22 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:53.407 06:16:22 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:53.407 06:16:22 -- scripts/common.sh@364 -- # decimal 1 00:06:53.407 06:16:22 -- scripts/common.sh@352 -- # local d=1 00:06:53.407 06:16:22 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:53.407 06:16:22 -- scripts/common.sh@354 -- # echo 1 00:06:53.407 06:16:22 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:53.407 06:16:22 -- scripts/common.sh@365 -- # decimal 2 00:06:53.407 06:16:22 -- scripts/common.sh@352 -- # local d=2 00:06:53.407 06:16:22 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:53.407 06:16:22 -- scripts/common.sh@354 -- # echo 2 00:06:53.407 06:16:22 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:53.407 06:16:22 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:53.407 06:16:22 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:53.407 06:16:22 -- scripts/common.sh@367 -- # return 0 00:06:53.407 06:16:22 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:53.407 06:16:22 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:53.407 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.407 --rc genhtml_branch_coverage=1 00:06:53.407 --rc genhtml_function_coverage=1 00:06:53.407 --rc genhtml_legend=1 00:06:53.407 --rc geninfo_all_blocks=1 00:06:53.407 --rc geninfo_unexecuted_blocks=1 00:06:53.407 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.407 ' 00:06:53.407 06:16:22 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:53.407 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.407 --rc genhtml_branch_coverage=1 00:06:53.407 --rc genhtml_function_coverage=1 00:06:53.407 --rc genhtml_legend=1 00:06:53.407 --rc geninfo_all_blocks=1 00:06:53.407 --rc geninfo_unexecuted_blocks=1 00:06:53.407 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.407 ' 00:06:53.407 06:16:22 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:53.407 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.407 --rc genhtml_branch_coverage=1 00:06:53.407 
--rc genhtml_function_coverage=1 00:06:53.407 --rc genhtml_legend=1 00:06:53.407 --rc geninfo_all_blocks=1 00:06:53.407 --rc geninfo_unexecuted_blocks=1 00:06:53.407 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.407 ' 00:06:53.407 06:16:22 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:53.407 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.407 --rc genhtml_branch_coverage=1 00:06:53.407 --rc genhtml_function_coverage=1 00:06:53.407 --rc genhtml_legend=1 00:06:53.407 --rc geninfo_all_blocks=1 00:06:53.407 --rc geninfo_unexecuted_blocks=1 00:06:53.407 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.407 ' 00:06:53.407 06:16:22 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:06:53.407 06:16:22 -- app/cmdline.sh@17 -- # spdk_tgt_pid=28203 00:06:53.407 06:16:22 -- app/cmdline.sh@18 -- # waitforlisten 28203 00:06:53.407 06:16:22 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:06:53.407 06:16:22 -- common/autotest_common.sh@829 -- # '[' -z 28203 ']' 00:06:53.407 06:16:22 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:53.407 06:16:22 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:53.408 06:16:22 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:53.408 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:53.408 06:16:22 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:53.408 06:16:22 -- common/autotest_common.sh@10 -- # set +x 00:06:53.408 [2024-11-27 06:16:22.921751] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:53.408 [2024-11-27 06:16:22.921823] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid28203 ] 00:06:53.667 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.667 [2024-11-27 06:16:22.988735] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.667 [2024-11-27 06:16:23.062471] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:53.668 [2024-11-27 06:16:23.062579] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.237 06:16:23 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:54.237 06:16:23 -- common/autotest_common.sh@862 -- # return 0 00:06:54.237 06:16:23 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:06:54.497 { 00:06:54.497 "version": "SPDK v24.01.1-pre git sha1 c13c99a5e", 00:06:54.497 "fields": { 00:06:54.497 "major": 24, 00:06:54.497 "minor": 1, 00:06:54.497 "patch": 1, 00:06:54.497 "suffix": "-pre", 00:06:54.497 "commit": "c13c99a5e" 00:06:54.497 } 00:06:54.497 } 00:06:54.497 06:16:23 -- app/cmdline.sh@22 -- # expected_methods=() 00:06:54.497 06:16:23 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:06:54.497 06:16:23 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:06:54.497 06:16:23 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:06:54.497 06:16:23 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:06:54.497 06:16:23 -- app/cmdline.sh@26 -- # sort 00:06:54.497 06:16:23 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:06:54.497 06:16:23 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:54.497 06:16:23 -- common/autotest_common.sh@10 -- # set +x 00:06:54.497 06:16:23 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:54.497 06:16:23 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:06:54.497 06:16:23 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:06:54.497 06:16:23 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:54.497 06:16:23 -- common/autotest_common.sh@650 -- # local es=0 00:06:54.497 06:16:23 -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:54.497 06:16:23 -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:54.497 06:16:23 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:54.497 06:16:23 -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:54.497 06:16:23 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:54.497 06:16:23 -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:54.497 06:16:23 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:54.497 06:16:23 -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:54.497 06:16:23 -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:06:54.497 06:16:23 -- 
common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:54.757 request: 00:06:54.757 { 00:06:54.757 "method": "env_dpdk_get_mem_stats", 00:06:54.757 "req_id": 1 00:06:54.757 } 00:06:54.757 Got JSON-RPC error response 00:06:54.757 response: 00:06:54.757 { 00:06:54.757 "code": -32601, 00:06:54.757 "message": "Method not found" 00:06:54.757 } 00:06:54.757 06:16:24 -- common/autotest_common.sh@653 -- # es=1 00:06:54.757 06:16:24 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:54.757 06:16:24 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:54.757 06:16:24 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:54.757 06:16:24 -- app/cmdline.sh@1 -- # killprocess 28203 00:06:54.757 06:16:24 -- common/autotest_common.sh@936 -- # '[' -z 28203 ']' 00:06:54.757 06:16:24 -- common/autotest_common.sh@940 -- # kill -0 28203 00:06:54.757 06:16:24 -- common/autotest_common.sh@941 -- # uname 00:06:54.757 06:16:24 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:54.757 06:16:24 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 28203 00:06:54.757 06:16:24 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:54.757 06:16:24 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:54.757 06:16:24 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 28203' 00:06:54.757 killing process with pid 28203 00:06:54.757 06:16:24 -- common/autotest_common.sh@955 -- # kill 28203 00:06:54.757 06:16:24 -- common/autotest_common.sh@960 -- # wait 28203 00:06:55.016 00:06:55.016 real 0m1.776s 00:06:55.016 user 0m2.035s 00:06:55.016 sys 0m0.514s 00:06:55.016 06:16:24 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:55.016 06:16:24 -- common/autotest_common.sh@10 -- # set +x 00:06:55.016 ************************************ 00:06:55.016 END TEST app_cmdline 00:06:55.016 ************************************ 00:06:55.016 06:16:24 -- spdk/autotest.sh@179 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:06:55.016 06:16:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:55.016 06:16:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:55.016 06:16:24 -- common/autotest_common.sh@10 -- # set +x 00:06:55.016 ************************************ 00:06:55.016 START TEST version 00:06:55.016 ************************************ 00:06:55.016 06:16:24 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:06:55.276 * Looking for test storage... 
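The app_cmdline run above is an allowlist check: a target started with --rpcs-allowed answers only the listed methods, and anything else comes back as JSON-RPC error -32601, exactly the 'Method not found' response captured above. A minimal reproduction, assuming the same two allowed methods:

  # Only spdk_get_version and rpc_get_methods are reachable on this socket.
  spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
  rpc.py spdk_get_version          # allowed: prints the version object
  rpc.py rpc_get_methods           # allowed: lists exactly the two methods
  rpc.py env_dpdk_get_mem_stats    # blocked: fails with code -32601, 'Method not found'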
00:06:55.276 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:55.276 06:16:24 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:55.276 06:16:24 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:55.276 06:16:24 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:55.276 06:16:24 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:55.276 06:16:24 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:55.276 06:16:24 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:55.276 06:16:24 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:55.276 06:16:24 -- scripts/common.sh@335 -- # IFS=.-: 00:06:55.276 06:16:24 -- scripts/common.sh@335 -- # read -ra ver1 00:06:55.276 06:16:24 -- scripts/common.sh@336 -- # IFS=.-: 00:06:55.276 06:16:24 -- scripts/common.sh@336 -- # read -ra ver2 00:06:55.276 06:16:24 -- scripts/common.sh@337 -- # local 'op=<' 00:06:55.276 06:16:24 -- scripts/common.sh@339 -- # ver1_l=2 00:06:55.276 06:16:24 -- scripts/common.sh@340 -- # ver2_l=1 00:06:55.276 06:16:24 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:55.276 06:16:24 -- scripts/common.sh@343 -- # case "$op" in 00:06:55.276 06:16:24 -- scripts/common.sh@344 -- # : 1 00:06:55.276 06:16:24 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:55.276 06:16:24 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:55.276 06:16:24 -- scripts/common.sh@364 -- # decimal 1 00:06:55.276 06:16:24 -- scripts/common.sh@352 -- # local d=1 00:06:55.276 06:16:24 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:55.276 06:16:24 -- scripts/common.sh@354 -- # echo 1 00:06:55.276 06:16:24 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:55.276 06:16:24 -- scripts/common.sh@365 -- # decimal 2 00:06:55.276 06:16:24 -- scripts/common.sh@352 -- # local d=2 00:06:55.276 06:16:24 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:55.276 06:16:24 -- scripts/common.sh@354 -- # echo 2 00:06:55.276 06:16:24 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:55.276 06:16:24 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:55.276 06:16:24 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:55.276 06:16:24 -- scripts/common.sh@367 -- # return 0 00:06:55.276 06:16:24 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:55.276 06:16:24 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:55.276 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.276 --rc genhtml_branch_coverage=1 00:06:55.276 --rc genhtml_function_coverage=1 00:06:55.276 --rc genhtml_legend=1 00:06:55.276 --rc geninfo_all_blocks=1 00:06:55.276 --rc geninfo_unexecuted_blocks=1 00:06:55.276 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.276 ' 00:06:55.276 06:16:24 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:55.276 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.276 --rc genhtml_branch_coverage=1 00:06:55.276 --rc genhtml_function_coverage=1 00:06:55.276 --rc genhtml_legend=1 00:06:55.276 --rc geninfo_all_blocks=1 00:06:55.276 --rc geninfo_unexecuted_blocks=1 00:06:55.276 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.276 ' 00:06:55.276 06:16:24 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:55.276 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.276 --rc genhtml_branch_coverage=1 00:06:55.276 
--rc genhtml_function_coverage=1 00:06:55.276 --rc genhtml_legend=1 00:06:55.276 --rc geninfo_all_blocks=1 00:06:55.276 --rc geninfo_unexecuted_blocks=1 00:06:55.276 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.276 ' 00:06:55.276 06:16:24 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:55.276 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.276 --rc genhtml_branch_coverage=1 00:06:55.276 --rc genhtml_function_coverage=1 00:06:55.276 --rc genhtml_legend=1 00:06:55.276 --rc geninfo_all_blocks=1 00:06:55.276 --rc geninfo_unexecuted_blocks=1 00:06:55.276 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.276 ' 00:06:55.276 06:16:24 -- app/version.sh@17 -- # get_header_version major 00:06:55.276 06:16:24 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:55.276 06:16:24 -- app/version.sh@14 -- # cut -f2 00:06:55.276 06:16:24 -- app/version.sh@14 -- # tr -d '"' 00:06:55.276 06:16:24 -- app/version.sh@17 -- # major=24 00:06:55.276 06:16:24 -- app/version.sh@18 -- # get_header_version minor 00:06:55.276 06:16:24 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:55.276 06:16:24 -- app/version.sh@14 -- # cut -f2 00:06:55.276 06:16:24 -- app/version.sh@14 -- # tr -d '"' 00:06:55.276 06:16:24 -- app/version.sh@18 -- # minor=1 00:06:55.276 06:16:24 -- app/version.sh@19 -- # get_header_version patch 00:06:55.276 06:16:24 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:55.276 06:16:24 -- app/version.sh@14 -- # cut -f2 00:06:55.276 06:16:24 -- app/version.sh@14 -- # tr -d '"' 00:06:55.276 06:16:24 -- app/version.sh@19 -- # patch=1 00:06:55.276 06:16:24 -- app/version.sh@20 -- # get_header_version suffix 00:06:55.276 06:16:24 -- app/version.sh@14 -- # tr -d '"' 00:06:55.276 06:16:24 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:55.276 06:16:24 -- app/version.sh@14 -- # cut -f2 00:06:55.276 06:16:24 -- app/version.sh@20 -- # suffix=-pre 00:06:55.276 06:16:24 -- app/version.sh@22 -- # version=24.1 00:06:55.276 06:16:24 -- app/version.sh@25 -- # (( patch != 0 )) 00:06:55.276 06:16:24 -- app/version.sh@25 -- # version=24.1.1 00:06:55.276 06:16:24 -- app/version.sh@28 -- # version=24.1.1rc0 00:06:55.276 06:16:24 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:55.276 06:16:24 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:06:55.276 06:16:24 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:06:55.276 06:16:24 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:06:55.276 00:06:55.276 real 0m0.267s 00:06:55.276 user 0m0.144s 00:06:55.276 sys 0m0.174s 00:06:55.276 06:16:24 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:55.276 06:16:24 -- common/autotest_common.sh@10 -- # set +x 00:06:55.276 
************************************ 00:06:55.276 END TEST version 00:06:55.276 ************************************ 00:06:55.536 06:16:24 -- spdk/autotest.sh@181 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@191 -- # uname -s 00:06:55.536 06:16:24 -- spdk/autotest.sh@191 -- # [[ Linux == Linux ]] 00:06:55.536 06:16:24 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:06:55.536 06:16:24 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:06:55.536 06:16:24 -- spdk/autotest.sh@204 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@251 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@255 -- # timing_exit lib 00:06:55.536 06:16:24 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:55.536 06:16:24 -- common/autotest_common.sh@10 -- # set +x 00:06:55.536 06:16:24 -- spdk/autotest.sh@257 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@265 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@298 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@302 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:06:55.536 06:16:24 -- spdk/autotest.sh@353 -- # [[ 0 -eq 1 ]] 00:06:55.536 06:16:24 -- spdk/autotest.sh@357 -- # [[ 0 -eq 1 ]] 00:06:55.536 06:16:24 -- spdk/autotest.sh@361 -- # [[ 1 -eq 1 ]] 00:06:55.536 06:16:24 -- spdk/autotest.sh@362 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:06:55.536 06:16:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:55.536 06:16:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:55.536 06:16:24 -- common/autotest_common.sh@10 -- # set +x 00:06:55.536 ************************************ 00:06:55.536 START TEST llvm_fuzz 00:06:55.536 ************************************ 00:06:55.536 06:16:24 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:06:55.536 * Looking for test storage... 
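The version test that just finished is plain text plumbing: each component is grepped out of include/spdk/version.h, the value field is cut out, surrounding quotes are stripped, and the assembled string is checked against what the installed Python package reports. The same extraction, assuming the SPDK repo root as the working directory (get_ver is a hypothetical shorthand here; the in-tree helper is get_header_version):

  # Pull one '#define SPDK_VERSION_<FIELD> <value>' out of version.h.
  get_ver() {
      grep -E "^#define SPDK_VERSION_$1[[:space:]]+" include/spdk/version.h \
          | cut -f2 | tr -d '"'
  }
  echo "$(get_ver MAJOR).$(get_ver MINOR).$(get_ver PATCH)$(get_ver SUFFIX)"  # 24.1.1-pre
  # The Python package reports the PEP 440 spelling of the same version:
  python3 -c 'import spdk; print(spdk.__version__)'                           # 24.1.1rc0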
00:06:55.536 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:06:55.536 06:16:25 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:55.536 06:16:25 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:55.536 06:16:25 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:55.797 06:16:25 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:55.797 06:16:25 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:55.797 06:16:25 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:55.797 06:16:25 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:55.797 06:16:25 -- scripts/common.sh@335 -- # IFS=.-: 00:06:55.797 06:16:25 -- scripts/common.sh@335 -- # read -ra ver1 00:06:55.797 06:16:25 -- scripts/common.sh@336 -- # IFS=.-: 00:06:55.797 06:16:25 -- scripts/common.sh@336 -- # read -ra ver2 00:06:55.797 06:16:25 -- scripts/common.sh@337 -- # local 'op=<' 00:06:55.797 06:16:25 -- scripts/common.sh@339 -- # ver1_l=2 00:06:55.797 06:16:25 -- scripts/common.sh@340 -- # ver2_l=1 00:06:55.797 06:16:25 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:55.797 06:16:25 -- scripts/common.sh@343 -- # case "$op" in 00:06:55.797 06:16:25 -- scripts/common.sh@344 -- # : 1 00:06:55.797 06:16:25 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:55.797 06:16:25 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:55.797 06:16:25 -- scripts/common.sh@364 -- # decimal 1 00:06:55.797 06:16:25 -- scripts/common.sh@352 -- # local d=1 00:06:55.797 06:16:25 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:55.797 06:16:25 -- scripts/common.sh@354 -- # echo 1 00:06:55.797 06:16:25 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:55.797 06:16:25 -- scripts/common.sh@365 -- # decimal 2 00:06:55.797 06:16:25 -- scripts/common.sh@352 -- # local d=2 00:06:55.797 06:16:25 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:55.797 06:16:25 -- scripts/common.sh@354 -- # echo 2 00:06:55.797 06:16:25 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:55.797 06:16:25 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:55.797 06:16:25 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:55.797 06:16:25 -- scripts/common.sh@367 -- # return 0 00:06:55.797 06:16:25 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:55.797 06:16:25 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:55.797 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.797 --rc genhtml_branch_coverage=1 00:06:55.797 --rc genhtml_function_coverage=1 00:06:55.797 --rc genhtml_legend=1 00:06:55.797 --rc geninfo_all_blocks=1 00:06:55.797 --rc geninfo_unexecuted_blocks=1 00:06:55.797 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.797 ' 00:06:55.797 06:16:25 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:55.797 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.797 --rc genhtml_branch_coverage=1 00:06:55.797 --rc genhtml_function_coverage=1 00:06:55.797 --rc genhtml_legend=1 00:06:55.797 --rc geninfo_all_blocks=1 00:06:55.797 --rc geninfo_unexecuted_blocks=1 00:06:55.797 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.797 ' 00:06:55.797 06:16:25 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:55.797 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.797 --rc genhtml_branch_coverage=1 00:06:55.797 
--rc genhtml_function_coverage=1 00:06:55.797 --rc genhtml_legend=1 00:06:55.797 --rc geninfo_all_blocks=1 00:06:55.797 --rc geninfo_unexecuted_blocks=1 00:06:55.797 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.797 ' 00:06:55.797 06:16:25 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:55.797 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.797 --rc genhtml_branch_coverage=1 00:06:55.797 --rc genhtml_function_coverage=1 00:06:55.797 --rc genhtml_legend=1 00:06:55.797 --rc geninfo_all_blocks=1 00:06:55.797 --rc geninfo_unexecuted_blocks=1 00:06:55.797 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.797 ' 00:06:55.797 06:16:25 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:06:55.798 06:16:25 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:06:55.798 06:16:25 -- common/autotest_common.sh@548 -- # fuzzers=() 00:06:55.798 06:16:25 -- common/autotest_common.sh@548 -- # local fuzzers 00:06:55.798 06:16:25 -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:06:55.798 06:16:25 -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:06:55.798 06:16:25 -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:06:55.798 06:16:25 -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:06:55.798 06:16:25 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:06:55.798 06:16:25 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:06:55.798 06:16:25 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:06:55.798 06:16:25 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:06:55.798 06:16:25 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:06:55.798 06:16:25 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:06:55.798 06:16:25 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:06:55.798 06:16:25 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:06:55.798 06:16:25 -- fuzz/llvm.sh@19 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:06:55.798 06:16:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:55.798 06:16:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:55.798 06:16:25 -- common/autotest_common.sh@10 -- # set +x 00:06:55.798 ************************************ 00:06:55.798 START TEST nvmf_fuzz 00:06:55.798 ************************************ 00:06:55.798 06:16:25 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:06:55.798 * Looking for test storage... 
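llvm.sh finds its fuzzer suites with nothing more than a glob and a suffix strip: every entry under test/fuzz/llvm/ becomes a candidate, the directory prefix is removed, and a case statement skips helpers such as common.sh and llvm-gcov.sh before each real suite's run.sh is dispatched via run_test. The core of that pattern (the case arms are a sketch of what the trace above shows, not a verbatim copy of llvm.sh):

  fuzzers=("$rootdir/test/fuzz/llvm/"*)   # glob yields: common.sh llvm-gcov.sh nvmf vfio
  fuzzers=("${fuzzers[@]##*/}")           # strip leading paths, keep basenames
  for fuzzer in "${fuzzers[@]}"; do
      case "$fuzzer" in
          nvmf | vfio) run_test "${fuzzer}_fuzz" "$rootdir/test/fuzz/llvm/$fuzzer/run.sh" ;;
          *) ;;                           # common.sh, llvm-gcov.sh: helpers, not suites
      esac
  done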
00:06:55.798 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:55.798 06:16:25 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:55.798 06:16:25 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:55.798 06:16:25 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:55.798 06:16:25 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:55.798 06:16:25 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:55.798 06:16:25 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:55.798 06:16:25 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:55.798 06:16:25 -- scripts/common.sh@335 -- # IFS=.-: 00:06:55.798 06:16:25 -- scripts/common.sh@335 -- # read -ra ver1 00:06:55.798 06:16:25 -- scripts/common.sh@336 -- # IFS=.-: 00:06:55.798 06:16:25 -- scripts/common.sh@336 -- # read -ra ver2 00:06:55.798 06:16:25 -- scripts/common.sh@337 -- # local 'op=<' 00:06:55.798 06:16:25 -- scripts/common.sh@339 -- # ver1_l=2 00:06:55.798 06:16:25 -- scripts/common.sh@340 -- # ver2_l=1 00:06:55.798 06:16:25 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:55.798 06:16:25 -- scripts/common.sh@343 -- # case "$op" in 00:06:55.798 06:16:25 -- scripts/common.sh@344 -- # : 1 00:06:55.798 06:16:25 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:55.798 06:16:25 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:55.798 06:16:25 -- scripts/common.sh@364 -- # decimal 1 00:06:55.798 06:16:25 -- scripts/common.sh@352 -- # local d=1 00:06:55.798 06:16:25 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:55.798 06:16:25 -- scripts/common.sh@354 -- # echo 1 00:06:55.798 06:16:25 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:55.798 06:16:25 -- scripts/common.sh@365 -- # decimal 2 00:06:55.798 06:16:25 -- scripts/common.sh@352 -- # local d=2 00:06:55.798 06:16:25 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:55.798 06:16:25 -- scripts/common.sh@354 -- # echo 2 00:06:55.798 06:16:25 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:55.798 06:16:25 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:55.798 06:16:25 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:55.798 06:16:25 -- scripts/common.sh@367 -- # return 0 00:06:55.798 06:16:25 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:55.798 06:16:25 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:55.798 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.798 --rc genhtml_branch_coverage=1 00:06:55.798 --rc genhtml_function_coverage=1 00:06:55.798 --rc genhtml_legend=1 00:06:55.798 --rc geninfo_all_blocks=1 00:06:55.798 --rc geninfo_unexecuted_blocks=1 00:06:55.798 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.798 ' 00:06:55.798 06:16:25 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:55.798 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.798 --rc genhtml_branch_coverage=1 00:06:55.798 --rc genhtml_function_coverage=1 00:06:55.798 --rc genhtml_legend=1 00:06:55.798 --rc geninfo_all_blocks=1 00:06:55.798 --rc geninfo_unexecuted_blocks=1 00:06:55.798 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.798 ' 00:06:55.798 06:16:25 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:55.798 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.798 --rc genhtml_branch_coverage=1 
00:06:55.798 --rc genhtml_function_coverage=1 00:06:55.798 --rc genhtml_legend=1 00:06:55.798 --rc geninfo_all_blocks=1 00:06:55.798 --rc geninfo_unexecuted_blocks=1 00:06:55.798 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.798 ' 00:06:55.798 06:16:25 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:55.798 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.798 --rc genhtml_branch_coverage=1 00:06:55.798 --rc genhtml_function_coverage=1 00:06:55.798 --rc genhtml_legend=1 00:06:55.798 --rc geninfo_all_blocks=1 00:06:55.798 --rc geninfo_unexecuted_blocks=1 00:06:55.798 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.798 ' 00:06:55.798 06:16:25 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:06:55.798 06:16:25 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:06:55.798 06:16:25 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:06:55.798 06:16:25 -- common/autotest_common.sh@34 -- # set -e 00:06:55.798 06:16:25 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:06:55.798 06:16:25 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:06:55.798 06:16:25 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:06:55.798 06:16:25 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:06:55.798 06:16:25 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:06:55.798 06:16:25 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:06:55.798 06:16:25 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:06:55.798 06:16:25 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:06:55.798 06:16:25 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:06:55.798 06:16:25 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:06:55.798 06:16:25 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:06:55.798 06:16:25 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:06:55.798 06:16:25 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:06:55.798 06:16:25 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:06:55.798 06:16:25 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:06:55.798 06:16:25 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:06:55.798 06:16:25 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:06:55.798 06:16:25 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:06:55.798 06:16:25 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:06:55.798 06:16:25 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:06:55.798 06:16:25 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:06:55.798 06:16:25 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:06:55.798 06:16:25 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:06:55.798 06:16:25 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:06:55.798 06:16:25 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:06:55.798 06:16:25 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:06:55.798 06:16:25 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:06:55.798 06:16:25 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:06:55.798 06:16:25 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:06:55.798 
06:16:25 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:06:55.798 06:16:25 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:06:55.798 06:16:25 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:06:55.798 06:16:25 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:06:55.798 06:16:25 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:06:55.798 06:16:25 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:06:55.798 06:16:25 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:06:55.798 06:16:25 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:06:55.798 06:16:25 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:06:55.798 06:16:25 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:06:55.798 06:16:25 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:06:55.798 06:16:25 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:06:55.798 06:16:25 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:06:55.798 06:16:25 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:06:55.798 06:16:25 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:06:55.798 06:16:25 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:06:55.798 06:16:25 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:06:55.798 06:16:25 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:06:55.798 06:16:25 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:06:55.798 06:16:25 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:06:55.798 06:16:25 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:06:55.798 06:16:25 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:06:55.798 06:16:25 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:06:55.798 06:16:25 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:06:55.798 06:16:25 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:06:55.798 06:16:25 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:06:55.798 06:16:25 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:06:55.798 06:16:25 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:06:55.798 06:16:25 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:06:55.798 06:16:25 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:06:55.798 06:16:25 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:06:55.799 06:16:25 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:06:55.799 06:16:25 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:06:55.799 06:16:25 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:06:55.799 06:16:25 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:06:55.799 06:16:25 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:06:55.799 06:16:25 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:06:55.799 06:16:25 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:06:55.799 06:16:25 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:06:55.799 06:16:25 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:06:55.799 06:16:25 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:06:55.799 06:16:25 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:06:55.799 06:16:25 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:06:55.799 06:16:25 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:06:55.799 06:16:25 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:06:55.799 06:16:25 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 
00:06:55.799 06:16:25 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:06:55.799 06:16:25 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:06:55.799 06:16:25 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:06:55.799 06:16:25 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:06:55.799 06:16:25 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:06:55.799 06:16:25 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:06:55.799 06:16:25 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:06:55.799 06:16:25 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:06:55.799 06:16:25 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:06:55.799 06:16:25 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:06:55.799 06:16:25 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:06:55.799 06:16:25 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:06:55.799 06:16:25 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:06:55.799 06:16:25 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:55.799 06:16:25 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:55.799 06:16:25 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:55.799 06:16:25 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:06:55.799 06:16:25 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:06:55.799 06:16:25 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:06:55.799 06:16:25 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:06:55.799 06:16:25 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:06:55.799 06:16:25 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:06:55.799 06:16:25 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:06:55.799 06:16:25 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:06:55.799 #define SPDK_CONFIG_H 00:06:55.799 #define SPDK_CONFIG_APPS 1 00:06:55.799 #define SPDK_CONFIG_ARCH native 00:06:55.799 #undef SPDK_CONFIG_ASAN 00:06:55.799 #undef SPDK_CONFIG_AVAHI 00:06:55.799 #undef SPDK_CONFIG_CET 00:06:55.799 #define SPDK_CONFIG_COVERAGE 1 00:06:55.799 #define SPDK_CONFIG_CROSS_PREFIX 00:06:55.799 #undef SPDK_CONFIG_CRYPTO 00:06:55.799 #undef SPDK_CONFIG_CRYPTO_MLX5 00:06:55.799 #undef SPDK_CONFIG_CUSTOMOCF 00:06:55.799 #undef SPDK_CONFIG_DAOS 00:06:55.799 #define SPDK_CONFIG_DAOS_DIR 00:06:55.799 #define SPDK_CONFIG_DEBUG 1 00:06:55.799 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:06:55.799 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:06:55.799 #define SPDK_CONFIG_DPDK_INC_DIR 00:06:55.799 #define SPDK_CONFIG_DPDK_LIB_DIR 00:06:55.799 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:06:55.799 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:06:55.799 #define SPDK_CONFIG_EXAMPLES 1 00:06:55.799 #undef SPDK_CONFIG_FC 00:06:55.799 #define SPDK_CONFIG_FC_PATH 00:06:55.799 #define SPDK_CONFIG_FIO_PLUGIN 1 
00:06:55.799 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:06:55.799 #undef SPDK_CONFIG_FUSE 00:06:55.799 #define SPDK_CONFIG_FUZZER 1 00:06:55.799 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:06:55.799 #undef SPDK_CONFIG_GOLANG 00:06:55.799 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:06:55.799 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:06:55.799 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:06:55.799 #undef SPDK_CONFIG_HAVE_LIBBSD 00:06:55.799 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:06:55.799 #define SPDK_CONFIG_IDXD 1 00:06:55.799 #define SPDK_CONFIG_IDXD_KERNEL 1 00:06:55.799 #undef SPDK_CONFIG_IPSEC_MB 00:06:55.799 #define SPDK_CONFIG_IPSEC_MB_DIR 00:06:55.799 #define SPDK_CONFIG_ISAL 1 00:06:55.799 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:06:55.799 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:06:55.799 #define SPDK_CONFIG_LIBDIR 00:06:55.799 #undef SPDK_CONFIG_LTO 00:06:55.799 #define SPDK_CONFIG_MAX_LCORES 00:06:55.799 #define SPDK_CONFIG_NVME_CUSE 1 00:06:55.799 #undef SPDK_CONFIG_OCF 00:06:55.799 #define SPDK_CONFIG_OCF_PATH 00:06:55.799 #define SPDK_CONFIG_OPENSSL_PATH 00:06:55.799 #undef SPDK_CONFIG_PGO_CAPTURE 00:06:55.799 #undef SPDK_CONFIG_PGO_USE 00:06:55.799 #define SPDK_CONFIG_PREFIX /usr/local 00:06:55.799 #undef SPDK_CONFIG_RAID5F 00:06:55.799 #undef SPDK_CONFIG_RBD 00:06:55.799 #define SPDK_CONFIG_RDMA 1 00:06:55.799 #define SPDK_CONFIG_RDMA_PROV verbs 00:06:55.799 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:06:55.799 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:06:55.799 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:06:55.799 #undef SPDK_CONFIG_SHARED 00:06:55.799 #undef SPDK_CONFIG_SMA 00:06:55.799 #define SPDK_CONFIG_TESTS 1 00:06:55.799 #undef SPDK_CONFIG_TSAN 00:06:55.799 #define SPDK_CONFIG_UBLK 1 00:06:55.799 #define SPDK_CONFIG_UBSAN 1 00:06:55.799 #undef SPDK_CONFIG_UNIT_TESTS 00:06:55.799 #undef SPDK_CONFIG_URING 00:06:55.799 #define SPDK_CONFIG_URING_PATH 00:06:55.799 #undef SPDK_CONFIG_URING_ZNS 00:06:55.799 #undef SPDK_CONFIG_USDT 00:06:55.799 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:06:55.799 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:06:55.799 #define SPDK_CONFIG_VFIO_USER 1 00:06:55.799 #define SPDK_CONFIG_VFIO_USER_DIR 00:06:55.799 #define SPDK_CONFIG_VHOST 1 00:06:55.799 #define SPDK_CONFIG_VIRTIO 1 00:06:55.799 #undef SPDK_CONFIG_VTUNE 00:06:55.799 #define SPDK_CONFIG_VTUNE_DIR 00:06:55.799 #define SPDK_CONFIG_WERROR 1 00:06:55.799 #define SPDK_CONFIG_WPDK_DIR 00:06:55.799 #undef SPDK_CONFIG_XNVME 00:06:55.799 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:06:55.799 06:16:25 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:06:55.799 06:16:25 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:55.799 06:16:25 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:55.799 06:16:25 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:55.799 06:16:25 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:55.799 06:16:25 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:55.799 06:16:25 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:55.799 06:16:25 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:55.799 06:16:25 -- paths/export.sh@5 -- # export PATH 00:06:55.799 06:16:25 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:55.799 06:16:25 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:06:55.799 06:16:25 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:06:55.799 06:16:25 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:06:55.799 06:16:25 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:06:55.799 06:16:25 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:06:56.061 06:16:25 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:06:56.061 06:16:25 -- pm/common@16 -- # TEST_TAG=N/A 00:06:56.061 06:16:25 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:06:56.061 06:16:25 -- common/autotest_common.sh@52 -- # : 1 00:06:56.061 06:16:25 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:06:56.061 06:16:25 -- common/autotest_common.sh@56 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:06:56.061 06:16:25 -- common/autotest_common.sh@58 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:06:56.061 06:16:25 -- common/autotest_common.sh@60 -- # : 1 00:06:56.061 06:16:25 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:06:56.061 06:16:25 -- common/autotest_common.sh@62 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:06:56.061 06:16:25 -- common/autotest_common.sh@64 -- # : 00:06:56.061 06:16:25 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:06:56.061 06:16:25 -- common/autotest_common.sh@66 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:06:56.061 06:16:25 -- common/autotest_common.sh@68 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:06:56.061 06:16:25 -- common/autotest_common.sh@70 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:06:56.061 06:16:25 -- common/autotest_common.sh@72 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:06:56.061 06:16:25 -- common/autotest_common.sh@74 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:06:56.061 06:16:25 -- common/autotest_common.sh@76 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:06:56.061 06:16:25 -- common/autotest_common.sh@78 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:06:56.061 06:16:25 -- common/autotest_common.sh@80 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:06:56.061 06:16:25 -- common/autotest_common.sh@82 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:06:56.061 06:16:25 -- common/autotest_common.sh@84 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:06:56.061 06:16:25 -- common/autotest_common.sh@86 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:06:56.061 06:16:25 -- common/autotest_common.sh@88 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:06:56.061 06:16:25 -- common/autotest_common.sh@90 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:06:56.061 06:16:25 -- common/autotest_common.sh@92 -- # : 1 00:06:56.061 06:16:25 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:06:56.061 06:16:25 -- common/autotest_common.sh@94 -- # : 1 00:06:56.061 06:16:25 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:06:56.061 06:16:25 -- common/autotest_common.sh@96 -- # : rdma 00:06:56.061 06:16:25 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:06:56.061 06:16:25 -- common/autotest_common.sh@98 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:06:56.061 06:16:25 -- common/autotest_common.sh@100 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:06:56.061 06:16:25 -- common/autotest_common.sh@102 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:06:56.061 06:16:25 -- common/autotest_common.sh@104 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:06:56.061 06:16:25 -- common/autotest_common.sh@106 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:06:56.061 06:16:25 -- common/autotest_common.sh@108 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:06:56.061 06:16:25 -- common/autotest_common.sh@110 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:06:56.061 06:16:25 -- common/autotest_common.sh@112 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:06:56.061 06:16:25 -- common/autotest_common.sh@114 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:06:56.061 06:16:25 -- common/autotest_common.sh@116 -- # : 1 00:06:56.061 06:16:25 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:06:56.061 06:16:25 -- common/autotest_common.sh@118 -- # : 00:06:56.061 06:16:25 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:06:56.061 06:16:25 -- common/autotest_common.sh@120 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:06:56.061 06:16:25 -- common/autotest_common.sh@122 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:06:56.061 06:16:25 -- common/autotest_common.sh@124 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:06:56.061 06:16:25 -- common/autotest_common.sh@126 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:06:56.061 06:16:25 -- common/autotest_common.sh@128 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:06:56.061 06:16:25 -- common/autotest_common.sh@130 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:06:56.061 06:16:25 -- common/autotest_common.sh@132 -- # : 00:06:56.061 06:16:25 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:06:56.061 06:16:25 -- common/autotest_common.sh@134 -- # : true 00:06:56.061 06:16:25 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:06:56.061 06:16:25 -- common/autotest_common.sh@136 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:06:56.061 06:16:25 -- common/autotest_common.sh@138 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:06:56.061 06:16:25 -- common/autotest_common.sh@140 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:06:56.061 06:16:25 -- common/autotest_common.sh@142 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:06:56.061 06:16:25 -- common/autotest_common.sh@144 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:06:56.061 06:16:25 -- common/autotest_common.sh@146 -- # : 0 00:06:56.061 06:16:25 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:06:56.062 06:16:25 -- common/autotest_common.sh@148 -- # : 00:06:56.062 06:16:25 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:06:56.062 06:16:25 -- common/autotest_common.sh@150 -- # : 0 00:06:56.062 06:16:25 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:06:56.062 06:16:25 -- common/autotest_common.sh@152 -- # : 0 00:06:56.062 06:16:25 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:06:56.062 06:16:25 -- common/autotest_common.sh@154 -- # : 0 00:06:56.062 06:16:25 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:06:56.062 06:16:25 -- common/autotest_common.sh@156 -- # : 0 00:06:56.062 06:16:25 -- 
common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:06:56.062 06:16:25 -- common/autotest_common.sh@158 -- # : 0 00:06:56.062 06:16:25 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:06:56.062 06:16:25 -- common/autotest_common.sh@160 -- # : 0 00:06:56.062 06:16:25 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:06:56.062 06:16:25 -- common/autotest_common.sh@163 -- # : 00:06:56.062 06:16:25 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:06:56.062 06:16:25 -- common/autotest_common.sh@165 -- # : 0 00:06:56.062 06:16:25 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:06:56.062 06:16:25 -- common/autotest_common.sh@167 -- # : 0 00:06:56.062 06:16:25 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:06:56.062 06:16:25 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:06:56.062 06:16:25 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:06:56.062 06:16:25 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:06:56.062 06:16:25 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:06:56.062 06:16:25 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:56.062 06:16:25 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:56.062 06:16:25 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:56.062 06:16:25 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:56.062 06:16:25 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:06:56.062 06:16:25 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:06:56.062 06:16:25 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:56.062 06:16:25 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:56.062 06:16:25 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:06:56.062 06:16:25 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:06:56.062 06:16:25 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:06:56.062 06:16:25 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:06:56.062 06:16:25 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:06:56.062 06:16:25 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:06:56.062 06:16:25 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:06:56.062 06:16:25 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:06:56.062 06:16:25 -- common/autotest_common.sh@196 -- # cat 00:06:56.062 06:16:25 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:06:56.062 06:16:25 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:06:56.062 06:16:25 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:06:56.062 06:16:25 -- 
common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:06:56.062 06:16:25 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:06:56.062 06:16:25 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:06:56.062 06:16:25 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:06:56.062 06:16:25 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:56.062 06:16:25 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:56.062 06:16:25 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:56.062 06:16:25 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:56.062 06:16:25 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:06:56.062 06:16:25 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:06:56.062 06:16:25 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:06:56.062 06:16:25 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:06:56.062 06:16:25 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:06:56.062 06:16:25 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:06:56.062 06:16:25 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:06:56.062 06:16:25 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:06:56.062 06:16:25 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:06:56.062 06:16:25 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:06:56.062 06:16:25 -- common/autotest_common.sh@249 -- # _LCOV= 00:06:56.062 06:16:25 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:06:56.062 06:16:25 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:06:56.062 06:16:25 -- common/autotest_common.sh@250 -- # _LCOV=1 00:06:56.062 06:16:25 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:06:56.062 06:16:25 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:06:56.062 06:16:25 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:06:56.062 06:16:25 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:06:56.062 06:16:25 -- common/autotest_common.sh@259 -- # export valgrind= 00:06:56.062 06:16:25 -- common/autotest_common.sh@259 -- # valgrind= 00:06:56.062 06:16:25 -- common/autotest_common.sh@265 -- # uname -s 00:06:56.062 06:16:25 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:06:56.062 06:16:25 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:06:56.062 06:16:25 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:06:56.062 06:16:25 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:06:56.062 06:16:25 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:06:56.062 06:16:25 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:06:56.062 06:16:25 
-- common/autotest_common.sh@275 -- # MAKE=make 00:06:56.062 06:16:25 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:06:56.062 06:16:25 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:06:56.062 06:16:25 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:06:56.062 06:16:25 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:06:56.062 06:16:25 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:06:56.062 06:16:25 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:06:56.062 06:16:25 -- common/autotest_common.sh@319 -- # [[ -z 28802 ]] 00:06:56.062 06:16:25 -- common/autotest_common.sh@319 -- # kill -0 28802 00:06:56.062 06:16:25 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:06:56.062 06:16:25 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:06:56.062 06:16:25 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:06:56.062 06:16:25 -- common/autotest_common.sh@332 -- # local mount target_dir 00:06:56.062 06:16:25 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:06:56.062 06:16:25 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:06:56.062 06:16:25 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:06:56.062 06:16:25 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:06:56.062 06:16:25 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.mxztOf 00:06:56.062 06:16:25 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:06:56.062 06:16:25 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:06:56.062 06:16:25 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:06:56.062 06:16:25 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.mxztOf/tests/nvmf /tmp/spdk.mxztOf 00:06:56.062 06:16:25 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:06:56.062 06:16:25 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:56.062 06:16:25 -- common/autotest_common.sh@328 -- # df -T 00:06:56.062 06:16:25 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:06:56.063 06:16:25 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:06:56.063 06:16:25 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:06:56.063 06:16:25 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:06:56.063 06:16:25 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 
00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # avails["$mount"]=53305544704 00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:06:56.063 06:16:25 -- common/autotest_common.sh@364 -- # uses["$mount"]=8425062400 00:06:56.063 06:16:25 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # avails["$mount"]=30862708736 00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:06:56.063 06:16:25 -- common/autotest_common.sh@364 -- # uses["$mount"]=2592768 00:06:56.063 06:16:25 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340129792 00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:06:56.063 06:16:25 -- common/autotest_common.sh@364 -- # uses["$mount"]=5992448 00:06:56.063 06:16:25 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # avails["$mount"]=30863708160 00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:06:56.063 06:16:25 -- common/autotest_common.sh@364 -- # uses["$mount"]=1597440 00:06:56.063 06:16:25 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:06:56.063 06:16:25 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:06:56.063 06:16:25 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:06:56.063 06:16:25 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:06:56.063 06:16:25 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:56.063 06:16:25 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:06:56.063 * Looking for test storage... 
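The set_test_storage trace above picks a scratch directory by parsing `df -T` into bash associative arrays keyed by mount point, then walking the storage candidates until one has enough free space for the 2147483648-byte request. A minimal sketch of that idiom follows, assuming GNU df and bash 4+; array and variable names mirror the trace, the `-B1` flag is an assumption added here so df reports plain bytes like the values logged above, and the candidate list is illustrative (the real one is built from $testdir and the mktemp fallback):

    #!/usr/bin/env bash
    # Probe every mounted filesystem, keyed by mount point, as in set_test_storage.
    declare -A mounts fss sizes avails uses
    requested_size=2147483648                    # the ~2 GiB requested in the trace
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$size                    # -B1 keeps these in plain bytes
        avails["$mount"]=$avail
        uses["$mount"]=$use
    done < <(df -T -B1 | grep -v Filesystem)

    # Walk candidate directories; the awk filter is the one the trace itself uses.
    storage_candidates=("$PWD" /tmp)             # illustrative stand-in for the real list
    for target_dir in "${storage_candidates[@]}"; do
        mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
        target_space=${avails[$mount]}
        if (( target_space >= requested_size )); then
            printf '* Found test storage at %s\n' "$target_dir"
            break
        fi
    done

The real function additionally special-cases tmpfs/ramfs/overlay mounts and computes a grown new_size before exporting SPDK_TEST_STORAGE, as the evaluation in the trace that follows shows.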
00:06:56.063 06:16:25 -- common/autotest_common.sh@369 -- # local target_space new_size 00:06:56.063 06:16:25 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:06:56.063 06:16:25 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:56.063 06:16:25 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:06:56.063 06:16:25 -- common/autotest_common.sh@373 -- # mount=/ 00:06:56.063 06:16:25 -- common/autotest_common.sh@375 -- # target_space=53305544704 00:06:56.063 06:16:25 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:06:56.063 06:16:25 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:06:56.063 06:16:25 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:06:56.063 06:16:25 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:06:56.063 06:16:25 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:06:56.063 06:16:25 -- common/autotest_common.sh@382 -- # new_size=10639654912 00:06:56.063 06:16:25 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:06:56.063 06:16:25 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:56.063 06:16:25 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:56.063 06:16:25 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:56.063 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:56.063 06:16:25 -- common/autotest_common.sh@390 -- # return 0 00:06:56.063 06:16:25 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:06:56.063 06:16:25 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:06:56.063 06:16:25 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:06:56.063 06:16:25 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:06:56.063 06:16:25 -- common/autotest_common.sh@1682 -- # true 00:06:56.063 06:16:25 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:06:56.063 06:16:25 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:06:56.063 06:16:25 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:06:56.063 06:16:25 -- common/autotest_common.sh@27 -- # exec 00:06:56.063 06:16:25 -- common/autotest_common.sh@29 -- # exec 00:06:56.063 06:16:25 -- common/autotest_common.sh@31 -- # xtrace_restore 00:06:56.063 06:16:25 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:06:56.063 06:16:25 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:06:56.063 06:16:25 -- common/autotest_common.sh@18 -- # set -x 00:06:56.063 06:16:25 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:56.063 06:16:25 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:56.063 06:16:25 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:56.063 06:16:25 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:56.063 06:16:25 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:56.063 06:16:25 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:56.063 06:16:25 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:56.063 06:16:25 -- scripts/common.sh@335 -- # IFS=.-: 00:06:56.063 06:16:25 -- scripts/common.sh@335 -- # read -ra ver1 00:06:56.063 06:16:25 -- scripts/common.sh@336 -- # IFS=.-: 00:06:56.063 06:16:25 -- scripts/common.sh@336 -- # read -ra ver2 00:06:56.063 06:16:25 -- scripts/common.sh@337 -- # local 'op=<' 00:06:56.063 06:16:25 -- scripts/common.sh@339 -- # ver1_l=2 00:06:56.063 06:16:25 -- scripts/common.sh@340 -- # ver2_l=1 00:06:56.063 06:16:25 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:56.063 06:16:25 -- scripts/common.sh@343 -- # case "$op" in 00:06:56.063 06:16:25 -- scripts/common.sh@344 -- # : 1 00:06:56.063 06:16:25 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:56.063 06:16:25 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:56.063 06:16:25 -- scripts/common.sh@364 -- # decimal 1 00:06:56.063 06:16:25 -- scripts/common.sh@352 -- # local d=1 00:06:56.063 06:16:25 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:56.063 06:16:25 -- scripts/common.sh@354 -- # echo 1 00:06:56.063 06:16:25 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:56.063 06:16:25 -- scripts/common.sh@365 -- # decimal 2 00:06:56.063 06:16:25 -- scripts/common.sh@352 -- # local d=2 00:06:56.063 06:16:25 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:56.063 06:16:25 -- scripts/common.sh@354 -- # echo 2 00:06:56.063 06:16:25 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:56.063 06:16:25 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:56.063 06:16:25 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:56.063 06:16:25 -- scripts/common.sh@367 -- # return 0 00:06:56.063 06:16:25 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:56.063 06:16:25 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:56.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.063 --rc genhtml_branch_coverage=1 00:06:56.063 --rc genhtml_function_coverage=1 00:06:56.063 --rc genhtml_legend=1 00:06:56.063 --rc geninfo_all_blocks=1 00:06:56.063 --rc geninfo_unexecuted_blocks=1 00:06:56.063 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.063 ' 00:06:56.063 06:16:25 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:56.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.063 --rc genhtml_branch_coverage=1 00:06:56.063 --rc genhtml_function_coverage=1 00:06:56.063 --rc genhtml_legend=1 00:06:56.063 --rc geninfo_all_blocks=1 00:06:56.063 --rc geninfo_unexecuted_blocks=1 00:06:56.063 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.063 ' 00:06:56.063 06:16:25 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:56.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:06:56.063 --rc genhtml_branch_coverage=1 00:06:56.063 --rc genhtml_function_coverage=1 00:06:56.063 --rc genhtml_legend=1 00:06:56.063 --rc geninfo_all_blocks=1 00:06:56.063 --rc geninfo_unexecuted_blocks=1 00:06:56.063 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.063 ' 00:06:56.063 06:16:25 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:56.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.063 --rc genhtml_branch_coverage=1 00:06:56.063 --rc genhtml_function_coverage=1 00:06:56.063 --rc genhtml_legend=1 00:06:56.063 --rc geninfo_all_blocks=1 00:06:56.063 --rc geninfo_unexecuted_blocks=1 00:06:56.063 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.063 ' 00:06:56.063 06:16:25 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:06:56.063 06:16:25 -- ../common.sh@8 -- # pids=() 00:06:56.063 06:16:25 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:06:56.063 06:16:25 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:06:56.063 06:16:25 -- nvmf/run.sh@56 -- # fuzz_num=25 00:06:56.063 06:16:25 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:06:56.063 06:16:25 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:06:56.063 06:16:25 -- nvmf/run.sh@61 -- # mem_size=512 00:06:56.064 06:16:25 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:06:56.064 06:16:25 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:06:56.064 06:16:25 -- ../common.sh@69 -- # local fuzz_num=25 00:06:56.064 06:16:25 -- ../common.sh@70 -- # local time=1 00:06:56.064 06:16:25 -- ../common.sh@72 -- # (( i = 0 )) 00:06:56.064 06:16:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:56.064 06:16:25 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:06:56.064 06:16:25 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:06:56.064 06:16:25 -- nvmf/run.sh@24 -- # local timen=1 00:06:56.064 06:16:25 -- nvmf/run.sh@25 -- # local core=0x1 00:06:56.064 06:16:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:56.064 06:16:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:06:56.064 06:16:25 -- nvmf/run.sh@29 -- # printf %02d 0 00:06:56.064 06:16:25 -- nvmf/run.sh@29 -- # port=4400 00:06:56.064 06:16:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:56.064 06:16:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:06:56.064 06:16:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:56.064 06:16:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:06:56.064 [2024-11-27 06:16:25.530713] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 
initialization... 00:06:56.064 [2024-11-27 06:16:25.530785] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid28863 ] 00:06:56.064 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.323 [2024-11-27 06:16:25.708021] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.323 [2024-11-27 06:16:25.772405] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:56.323 [2024-11-27 06:16:25.772538] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.323 [2024-11-27 06:16:25.830988] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:56.323 [2024-11-27 06:16:25.847342] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:06:56.582 INFO: Running with entropic power schedule (0xFF, 100). 00:06:56.582 INFO: Seed: 1613145939 00:06:56.582 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:56.582 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:56.582 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:56.582 INFO: A corpus is not provided, starting from an empty corpus 00:06:56.582 #2 INITED exec/s: 0 rss: 61Mb 00:06:56.582 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:56.582 This may also happen if the target rejected all inputs we tried so far 00:06:56.582 [2024-11-27 06:16:25.923245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:56.582 [2024-11-27 06:16:25.923282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.841 NEW_FUNC[1/671]: 0x43a858 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:06:56.841 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:56.841 #3 NEW cov: 11558 ft: 11559 corp: 2/97b lim: 320 exec/s: 0 rss: 68Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:06:56.841 [2024-11-27 06:16:26.264273] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:56.841 [2024-11-27 06:16:26.264322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.841 #4 NEW cov: 11697 ft: 12205 corp: 3/185b lim: 320 exec/s: 0 rss: 68Mb L: 88/96 MS: 1 InsertRepeatedBytes- 00:06:56.841 [2024-11-27 06:16:26.304281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:2b002500 cdw10:49494949 cdw11:49494949 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.841 [2024-11-27 06:16:26.304312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.841 NEW_FUNC[1/1]: 0x16c4058 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:06:56.841 #8 NEW cov: 11726 ft: 12817 corp: 4/286b lim: 320 exec/s: 0 rss: 68Mb 
L: 101/101 MS: 4 CMP-InsertRepeatedBytes-InsertByte-InsertRepeatedBytes- DE: "\001\000\000\000\000\000\000\002"- 00:06:56.841 [2024-11-27 06:16:26.344402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:2b002500 cdw10:49494949 cdw11:49494949 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.841 [2024-11-27 06:16:26.344430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.841 #9 NEW cov: 11811 ft: 13013 corp: 5/387b lim: 320 exec/s: 0 rss: 68Mb L: 101/101 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\002"- 00:06:57.101 [2024-11-27 06:16:26.394491] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:57.101 [2024-11-27 06:16:26.394518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.101 #10 NEW cov: 11811 ft: 13177 corp: 6/483b lim: 320 exec/s: 0 rss: 68Mb L: 96/101 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\002"- 00:06:57.101 [2024-11-27 06:16:26.434620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:57.101 [2024-11-27 06:16:26.434650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.101 #11 NEW cov: 11811 ft: 13298 corp: 7/579b lim: 320 exec/s: 0 rss: 69Mb L: 96/101 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\002"- 00:06:57.101 [2024-11-27 06:16:26.474726] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:80000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:57.101 [2024-11-27 06:16:26.474754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.101 #12 NEW cov: 11811 ft: 13367 corp: 8/675b lim: 320 exec/s: 0 rss: 69Mb L: 96/101 MS: 1 ChangeBit- 00:06:57.101 [2024-11-27 06:16:26.514894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:2b002500 cdw10:49494949 cdw11:49494949 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.101 [2024-11-27 06:16:26.514921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.101 #13 NEW cov: 11811 ft: 13395 corp: 9/776b lim: 320 exec/s: 0 rss: 69Mb L: 101/101 MS: 1 ChangeBinInt- 00:06:57.101 [2024-11-27 06:16:26.554869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:57.101 [2024-11-27 06:16:26.554895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.101 #14 NEW cov: 11811 ft: 13513 corp: 10/872b lim: 320 exec/s: 0 rss: 69Mb L: 96/101 MS: 1 ChangeBit- 00:06:57.101 [2024-11-27 06:16:26.595160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:2b002500 cdw10:49494949 cdw11:49494949 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.101 [2024-11-27 06:16:26.595189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.101 #15 NEW cov: 11811 ft: 
13560 corp: 11/973b lim: 320 exec/s: 0 rss: 69Mb L: 101/101 MS: 1 ChangeBit- 00:06:57.101 [2024-11-27 06:16:26.635300] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:80000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:57.101 [2024-11-27 06:16:26.635328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.361 #16 NEW cov: 11811 ft: 13578 corp: 12/1069b lim: 320 exec/s: 0 rss: 69Mb L: 96/101 MS: 1 ChangeByte- 00:06:57.361 [2024-11-27 06:16:26.675363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x100000000000000 00:06:57.361 [2024-11-27 06:16:26.675390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.361 #17 NEW cov: 11811 ft: 13610 corp: 13/1166b lim: 320 exec/s: 0 rss: 69Mb L: 97/101 MS: 1 InsertByte- 00:06:57.361 [2024-11-27 06:16:26.715633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:2b002500 cdw10:00000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.361 [2024-11-27 06:16:26.715659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.361 [2024-11-27 06:16:26.715766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:1000000 cdw10:00000000 cdw11:01494949 00:06:57.361 [2024-11-27 06:16:26.715784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.361 #18 NEW cov: 11813 ft: 13847 corp: 14/1356b lim: 320 exec/s: 0 rss: 69Mb L: 190/190 MS: 1 CrossOver- 00:06:57.361 [2024-11-27 06:16:26.755504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:57.361 [2024-11-27 06:16:26.755531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.361 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:57.361 #19 NEW cov: 11836 ft: 13867 corp: 15/1453b lim: 320 exec/s: 0 rss: 69Mb L: 97/190 MS: 1 InsertByte- 00:06:57.361 [2024-11-27 06:16:26.795824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:2b002500 cdw10:49494949 cdw11:49494949 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.361 [2024-11-27 06:16:26.795852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.361 #20 NEW cov: 11836 ft: 13873 corp: 16/1554b lim: 320 exec/s: 0 rss: 69Mb L: 101/190 MS: 1 ChangeBit- 00:06:57.361 [2024-11-27 06:16:26.845838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffff02000000 00:06:57.361 [2024-11-27 06:16:26.845864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.361 #21 NEW cov: 11836 ft: 13899 corp: 17/1650b lim: 320 exec/s: 0 rss: 69Mb L: 96/190 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\002"- 00:06:57.361 
[2024-11-27 06:16:26.886176] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:57.361 [2024-11-27 06:16:26.886204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.361 [2024-11-27 06:16:26.886322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:57.361 [2024-11-27 06:16:26.886340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.621 #22 NEW cov: 11836 ft: 13918 corp: 18/1804b lim: 320 exec/s: 22 rss: 69Mb L: 154/190 MS: 1 InsertRepeatedBytes- 00:06:57.621 [2024-11-27 06:16:26.925898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ff01ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:57.621 [2024-11-27 06:16:26.925925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.621 #23 NEW cov: 11836 ft: 13988 corp: 19/1900b lim: 320 exec/s: 23 rss: 69Mb L: 96/190 MS: 1 ChangeByte- 00:06:57.621 [2024-11-27 06:16:26.966390] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:80000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:57.621 [2024-11-27 06:16:26.966418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.621 [2024-11-27 06:16:26.966525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:74747474 cdw11:74747474 00:06:57.621 [2024-11-27 06:16:26.966540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.621 #24 NEW cov: 11836 ft: 14055 corp: 20/2065b lim: 320 exec/s: 24 rss: 69Mb L: 165/190 MS: 1 InsertRepeatedBytes- 00:06:57.621 [2024-11-27 06:16:27.006273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x1ffff 00:06:57.621 [2024-11-27 06:16:27.006303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.621 #25 NEW cov: 11836 ft: 14068 corp: 21/2161b lim: 320 exec/s: 25 rss: 69Mb L: 96/190 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\002"- 00:06:57.621 [2024-11-27 06:16:27.036409] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:57.621 [2024-11-27 06:16:27.036438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.621 #26 NEW cov: 11836 ft: 14090 corp: 22/2257b lim: 320 exec/s: 26 rss: 69Mb L: 96/190 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\002"- 00:06:57.621 [2024-11-27 06:16:27.066498] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:80000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:57.621 [2024-11-27 06:16:27.066524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:06:57.621 #27 NEW cov: 11836 ft: 14112 corp: 23/2353b lim: 320 exec/s: 27 rss: 69Mb L: 96/190 MS: 1 ChangeBinInt- 00:06:57.621 [2024-11-27 06:16:27.106520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:57.621 [2024-11-27 06:16:27.106547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.621 #28 NEW cov: 11836 ft: 14135 corp: 24/2450b lim: 320 exec/s: 28 rss: 69Mb L: 97/190 MS: 1 CopyPart- 00:06:57.621 [2024-11-27 06:16:27.147147] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:57.621 [2024-11-27 06:16:27.147174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.621 [2024-11-27 06:16:27.147284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:57.621 [2024-11-27 06:16:27.147300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.621 [2024-11-27 06:16:27.147415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:57.621 [2024-11-27 06:16:27.147430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.881 #29 NEW cov: 11836 ft: 14254 corp: 25/2672b lim: 320 exec/s: 29 rss: 70Mb L: 222/222 MS: 1 InsertRepeatedBytes- 00:06:57.881 [2024-11-27 06:16:27.187045] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:57.881 [2024-11-27 06:16:27.187072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.881 [2024-11-27 06:16:27.187182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:57.881 [2024-11-27 06:16:27.187197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.881 #30 NEW cov: 11836 ft: 14267 corp: 26/2826b lim: 320 exec/s: 30 rss: 70Mb L: 154/222 MS: 1 ChangeByte- 00:06:57.881 [2024-11-27 06:16:27.226957] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:80000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:57.881 [2024-11-27 06:16:27.226984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.881 #31 NEW cov: 11836 ft: 14284 corp: 27/2922b lim: 320 exec/s: 31 rss: 70Mb L: 96/222 MS: 1 ShuffleBytes- 00:06:57.881 [2024-11-27 06:16:27.267008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:57.881 [2024-11-27 06:16:27.267038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.881 #32 NEW cov: 11836 ft: 14288 corp: 28/3018b lim: 320 exec/s: 32 rss: 70Mb L: 96/222 MS: 1 CMP- DE: 
"G\214\234\355x-\222\000"- 00:06:57.881 [2024-11-27 06:16:27.297241] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:57.881 [2024-11-27 06:16:27.297269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.881 #33 NEW cov: 11836 ft: 14322 corp: 29/3102b lim: 320 exec/s: 33 rss: 70Mb L: 84/222 MS: 1 EraseBytes- 00:06:57.881 [2024-11-27 06:16:27.337354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:2b002500 cdw10:ffffffff cdw11:494901ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.881 [2024-11-27 06:16:27.337381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.881 #34 NEW cov: 11836 ft: 14339 corp: 30/3211b lim: 320 exec/s: 34 rss: 70Mb L: 109/222 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\001"- 00:06:57.881 [2024-11-27 06:16:27.377657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffff020000 00:06:57.881 [2024-11-27 06:16:27.377685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.881 [2024-11-27 06:16:27.377815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:57.881 [2024-11-27 06:16:27.377831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.881 #35 NEW cov: 11836 ft: 14409 corp: 31/3383b lim: 320 exec/s: 35 rss: 70Mb L: 172/222 MS: 1 CopyPart- 00:06:58.140 [2024-11-27 06:16:27.427502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ff01ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:58.140 [2024-11-27 06:16:27.427531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.140 #41 NEW cov: 11836 ft: 14423 corp: 32/3479b lim: 320 exec/s: 41 rss: 70Mb L: 96/222 MS: 1 ChangeBit- 00:06:58.140 [2024-11-27 06:16:27.467819] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000201000000 00:06:58.140 [2024-11-27 06:16:27.467846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.140 #42 NEW cov: 11836 ft: 14434 corp: 33/3575b lim: 320 exec/s: 42 rss: 70Mb L: 96/222 MS: 1 ChangeBinInt- 00:06:58.140 [2024-11-27 06:16:27.507896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:58.140 [2024-11-27 06:16:27.507923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.140 #43 NEW cov: 11836 ft: 14498 corp: 34/3679b lim: 320 exec/s: 43 rss: 70Mb L: 104/222 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\002"- 00:06:58.140 [2024-11-27 06:16:27.548087] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:4 cdw10:80000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:58.140 [2024-11-27 06:16:27.548116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.140 #44 NEW cov: 11836 ft: 14503 corp: 35/3775b lim: 320 exec/s: 44 rss: 70Mb L: 96/222 MS: 1 ShuffleBytes- 00:06:58.140 [2024-11-27 06:16:27.588172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:58.141 [2024-11-27 06:16:27.588203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.141 #45 NEW cov: 11836 ft: 14505 corp: 36/3871b lim: 320 exec/s: 45 rss: 70Mb L: 96/222 MS: 1 ChangeBinInt- 00:06:58.141 [2024-11-27 06:16:27.628342] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:80000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:58.141 [2024-11-27 06:16:27.628387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.141 #46 NEW cov: 11836 ft: 14511 corp: 37/3967b lim: 320 exec/s: 46 rss: 70Mb L: 96/222 MS: 1 CrossOver- 00:06:58.141 [2024-11-27 06:16:27.668455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:2b002500 cdw10:49494949 cdw11:49494949 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.141 [2024-11-27 06:16:27.668483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.400 #47 NEW cov: 11836 ft: 14571 corp: 38/4068b lim: 320 exec/s: 47 rss: 70Mb L: 101/222 MS: 1 ChangeBinInt- 00:06:58.400 [2024-11-27 06:16:27.708190] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000000000000 00:06:58.400 [2024-11-27 06:16:27.708218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.400 #48 NEW cov: 11836 ft: 14591 corp: 39/4164b lim: 320 exec/s: 48 rss: 70Mb L: 96/222 MS: 1 ShuffleBytes- 00:06:58.400 [2024-11-27 06:16:27.748605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:58.400 [2024-11-27 06:16:27.748632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.400 #49 NEW cov: 11836 ft: 14636 corp: 40/4260b lim: 320 exec/s: 49 rss: 70Mb L: 96/222 MS: 1 ShuffleBytes- 00:06:58.400 [2024-11-27 06:16:27.788829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:2b002500 cdw10:49494949 cdw11:49494949 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.400 [2024-11-27 06:16:27.788858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.400 #50 NEW cov: 11836 ft: 14659 corp: 41/4369b lim: 320 exec/s: 50 rss: 70Mb L: 109/222 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\001"- 00:06:58.400 [2024-11-27 06:16:27.828897] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:58.400 
[2024-11-27 06:16:27.828926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.400 #51 NEW cov: 11836 ft: 14663 corp: 42/4458b lim: 320 exec/s: 51 rss: 70Mb L: 89/222 MS: 1 InsertByte- 00:06:58.400 [2024-11-27 06:16:27.869026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:2b002500 cdw10:49494949 cdw11:49494949 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.400 [2024-11-27 06:16:27.869052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.400 #52 NEW cov: 11836 ft: 14670 corp: 43/4567b lim: 320 exec/s: 26 rss: 70Mb L: 109/222 MS: 1 ChangeBit- 00:06:58.400 #52 DONE cov: 11836 ft: 14670 corp: 43/4567b lim: 320 exec/s: 26 rss: 70Mb 00:06:58.400 ###### Recommended dictionary. ###### 00:06:58.400 "\001\000\000\000\000\000\000\002" # Uses: 7 00:06:58.400 "G\214\234\355x-\222\000" # Uses: 2 00:06:58.400 "\377\377\377\377\377\377\377\001" # Uses: 1 00:06:58.400 ###### End of recommended dictionary. ###### 00:06:58.400 Done 52 runs in 2 second(s) 00:06:58.660 06:16:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:06:58.660 06:16:28 -- ../common.sh@72 -- # (( i++ )) 00:06:58.660 06:16:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:58.660 06:16:28 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:06:58.660 06:16:28 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:06:58.660 06:16:28 -- nvmf/run.sh@24 -- # local timen=1 00:06:58.660 06:16:28 -- nvmf/run.sh@25 -- # local core=0x1 00:06:58.660 06:16:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:06:58.660 06:16:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:06:58.660 06:16:28 -- nvmf/run.sh@29 -- # printf %02d 1 00:06:58.660 06:16:28 -- nvmf/run.sh@29 -- # port=4401 00:06:58.660 06:16:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:06:58.660 06:16:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:06:58.660 06:16:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:58.660 06:16:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:06:58.660 [2024-11-27 06:16:28.041543] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
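The "Recommended dictionary" block printed at the end of the run above is standard libFuzzer output: each quoted byte string (in C-style octal escapes) is a token the fuzzer found productive, with a count of how often it was replayed via PersAutoDict. Those entries can be persisted and handed back to a later run as a dictionary file. A minimal sketch, with the octal escapes converted to the \xNN form libFuzzer dictionaries accept; the file path is made up here, and whether this harness forwards the standard -dict= flag through to libFuzzer is an assumption, not something this log confirms:

    # save the recommended entries in libFuzzer dictionary syntax
    # (one quoted value per line; '#' starts a comment)
    cat > /tmp/nvmf_0.dict <<'EOF'
    "\x01\x00\x00\x00\x00\x00\x00\x02"
    "G\x8c\x9c\xedx-\x92\x00"
    "\xff\xff\xff\xff\xff\xff\xff\x01"
    EOF
    # if accepted, append -dict=/tmp/nvmf_0.dict to the llvm_nvme_fuzz
    # invocation shown in the run.sh trace above
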
00:06:58.660 [2024-11-27 06:16:28.041610] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid29405 ] 00:06:58.660 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.919 [2024-11-27 06:16:28.214097] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.919 [2024-11-27 06:16:28.277050] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:58.919 [2024-11-27 06:16:28.277177] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.919 [2024-11-27 06:16:28.335180] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:58.919 [2024-11-27 06:16:28.351517] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:06:58.919 INFO: Running with entropic power schedule (0xFF, 100). 00:06:58.919 INFO: Seed: 4115142371 00:06:58.919 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:58.919 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:58.919 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:06:58.919 INFO: A corpus is not provided, starting from an empty corpus 00:06:58.919 #2 INITED exec/s: 0 rss: 60Mb 00:06:58.919 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:58.919 This may also happen if the target rejected all inputs we tried so far 00:06:58.919 [2024-11-27 06:16:28.399808] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32900) > buf size (4096) 00:06:58.919 [2024-11-27 06:16:28.400030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:58.919 [2024-11-27 06:16:28.400057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.180 NEW_FUNC[1/671]: 0x43b158 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:06:59.180 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:59.180 #9 NEW cov: 11639 ft: 11640 corp: 2/8b lim: 30 exec/s: 0 rss: 68Mb L: 7/7 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:06:59.180 [2024-11-27 06:16:28.700582] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32900) > buf size (4096) 00:06:59.180 [2024-11-27 06:16:28.700833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.180 [2024-11-27 06:16:28.700876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.440 #10 NEW cov: 11752 ft: 12146 corp: 3/17b lim: 30 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 CopyPart- 00:06:59.440 [2024-11-27 06:16:28.750613] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32900) > buf size (4096) 00:06:59.440 [2024-11-27 06:16:28.750837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
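The INFO header above (entropic power schedule, "Loaded 1 modules (344649 inline 8-bit counters)", the PC tables, and the empty-corpus warning) is libFuzzer's normal startup banner; the counter and PC-table counts come from the SanitizerCoverage instrumentation that clang adds under -fsanitize=fuzzer. A stand-alone illustration of producing the same banner with a trivial target, unrelated to the SPDK build itself (file and target names are hypothetical):

    # any libFuzzer target prints the same Seed/counters/PC-tables header
    clang -g -O1 -fsanitize=fuzzer,address my_target.c -o my_fuzzer
    ./my_fuzzer -print_final_stats=1 ./corpus
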
00:06:59.440 [2024-11-27 06:16:28.750863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.440 #14 NEW cov: 11758 ft: 12413 corp: 4/24b lim: 30 exec/s: 0 rss: 68Mb L: 7/9 MS: 4 EraseBytes-ChangeBinInt-EraseBytes-CrossOver- 00:06:59.440 [2024-11-27 06:16:28.790740] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32900) > buf size (4096) 00:06:59.440 [2024-11-27 06:16:28.790980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.440 [2024-11-27 06:16:28.791006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.440 #15 NEW cov: 11843 ft: 12672 corp: 5/31b lim: 30 exec/s: 0 rss: 68Mb L: 7/9 MS: 1 EraseBytes- 00:06:59.440 [2024-11-27 06:16:28.830873] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (128132) > buf size (4096) 00:06:59.440 [2024-11-27 06:16:28.831112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:7d200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.440 [2024-11-27 06:16:28.831139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.440 #21 NEW cov: 11843 ft: 12841 corp: 6/39b lim: 30 exec/s: 0 rss: 68Mb L: 8/9 MS: 1 InsertByte- 00:06:59.440 [2024-11-27 06:16:28.870949] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32900) > buf size (4096) 00:06:59.440 [2024-11-27 06:16:28.871193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.440 [2024-11-27 06:16:28.871218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.440 #22 NEW cov: 11843 ft: 13014 corp: 7/46b lim: 30 exec/s: 0 rss: 68Mb L: 7/9 MS: 1 CopyPart- 00:06:59.440 [2024-11-27 06:16:28.911094] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32900) > buf size (4096) 00:06:59.440 [2024-11-27 06:16:28.911229] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100008585 00:06:59.440 [2024-11-27 06:16:28.911439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.440 [2024-11-27 06:16:28.911466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.440 [2024-11-27 06:16:28.911534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0a208185 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.440 [2024-11-27 06:16:28.911548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.440 #28 NEW cov: 11849 ft: 13490 corp: 8/59b lim: 30 exec/s: 0 rss: 68Mb L: 13/13 MS: 1 InsertRepeatedBytes- 00:06:59.440 [2024-11-27 06:16:28.951194] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (8324) > buf size (4096) 00:06:59.440 [2024-11-27 06:16:28.951430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08200020 cdw11:00000000 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.440 [2024-11-27 06:16:28.951455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.440 #31 NEW cov: 11849 ft: 13527 corp: 9/67b lim: 30 exec/s: 0 rss: 68Mb L: 8/13 MS: 3 ChangeBit-ChangeBinInt-CrossOver- 00:06:59.701 [2024-11-27 06:16:28.991279] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (8324) > buf size (4096) 00:06:59.701 [2024-11-27 06:16:28.991510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.701 [2024-11-27 06:16:28.991536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.701 #32 NEW cov: 11849 ft: 13557 corp: 10/75b lim: 30 exec/s: 0 rss: 68Mb L: 8/13 MS: 1 ShuffleBytes- 00:06:59.701 [2024-11-27 06:16:29.031417] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (8324) > buf size (4096) 00:06:59.701 [2024-11-27 06:16:29.031632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.701 [2024-11-27 06:16:29.031656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.701 #33 NEW cov: 11849 ft: 13665 corp: 11/83b lim: 30 exec/s: 0 rss: 68Mb L: 8/13 MS: 1 CopyPart- 00:06:59.701 [2024-11-27 06:16:29.071523] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32900) > buf size (4096) 00:06:59.701 [2024-11-27 06:16:29.071648] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200a 00:06:59.701 [2024-11-27 06:16:29.071863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.701 [2024-11-27 06:16:29.071888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.701 [2024-11-27 06:16:29.071945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.701 [2024-11-27 06:16:29.071959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.701 #34 NEW cov: 11849 ft: 13713 corp: 12/97b lim: 30 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 CopyPart- 00:06:59.701 [2024-11-27 06:16:29.111648] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (128132) > buf size (4096) 00:06:59.701 [2024-11-27 06:16:29.111882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:7d200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.701 [2024-11-27 06:16:29.111908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.701 #35 NEW cov: 11849 ft: 13782 corp: 13/105b lim: 30 exec/s: 0 rss: 68Mb L: 8/14 MS: 1 CopyPart- 00:06:59.701 [2024-11-27 06:16:29.151763] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (652420) > buf size (4096) 00:06:59.701 [2024-11-27 06:16:29.151979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 
cdw10:7d200220 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.701 [2024-11-27 06:16:29.152004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.701 #36 NEW cov: 11849 ft: 13815 corp: 14/112b lim: 30 exec/s: 0 rss: 69Mb L: 7/14 MS: 1 EraseBytes- 00:06:59.701 [2024-11-27 06:16:29.191899] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (128132) > buf size (4096) 00:06:59.701 [2024-11-27 06:16:29.192113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:7d200024 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.701 [2024-11-27 06:16:29.192138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.701 #37 NEW cov: 11849 ft: 13839 corp: 15/120b lim: 30 exec/s: 0 rss: 69Mb L: 8/14 MS: 1 ChangeBit- 00:06:59.701 [2024-11-27 06:16:29.232066] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (33684) > buf size (4096) 00:06:59.701 [2024-11-27 06:16:29.232208] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (8224) > len (4) 00:06:59.701 [2024-11-27 06:16:29.232423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20e40003 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.701 [2024-11-27 06:16:29.232453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.701 [2024-11-27 06:16:29.232509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.701 [2024-11-27 06:16:29.232524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.961 #38 NEW cov: 11862 ft: 13902 corp: 16/135b lim: 30 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 CMP- DE: "\344\003\000\000\000\000\000\000"- 00:06:59.961 [2024-11-27 06:16:29.272101] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (254848) > buf size (4096) 00:06:59.961 [2024-11-27 06:16:29.272316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:f8df00df cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.961 [2024-11-27 06:16:29.272341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.961 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:59.961 #39 NEW cov: 11885 ft: 13951 corp: 17/143b lim: 30 exec/s: 0 rss: 69Mb L: 8/15 MS: 1 ChangeBinInt- 00:06:59.961 [2024-11-27 06:16:29.312253] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (128132) > buf size (4096) 00:06:59.961 [2024-11-27 06:16:29.312468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:7d200023 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.961 [2024-11-27 06:16:29.312495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.961 #40 NEW cov: 11885 ft: 13959 corp: 18/151b lim: 30 exec/s: 0 rss: 69Mb L: 8/15 MS: 1 ChangeBinInt- 00:06:59.961 [2024-11-27 06:16:29.352317] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: 
Get log page: len (33256) > buf size (4096) 00:06:59.961 [2024-11-27 06:16:29.352540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20790020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.961 [2024-11-27 06:16:29.352565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.961 #41 NEW cov: 11885 ft: 14020 corp: 19/158b lim: 30 exec/s: 0 rss: 69Mb L: 7/15 MS: 1 ChangeByte- 00:06:59.961 [2024-11-27 06:16:29.392476] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (652420) > buf size (4096) 00:06:59.961 [2024-11-27 06:16:29.392696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:7d200220 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.961 [2024-11-27 06:16:29.392723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.961 #42 NEW cov: 11885 ft: 14041 corp: 20/166b lim: 30 exec/s: 42 rss: 69Mb L: 8/15 MS: 1 CopyPart- 00:06:59.961 [2024-11-27 06:16:29.432575] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (8324) > buf size (4096) 00:06:59.961 [2024-11-27 06:16:29.432811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.961 [2024-11-27 06:16:29.432837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.961 #43 NEW cov: 11885 ft: 14090 corp: 21/174b lim: 30 exec/s: 43 rss: 69Mb L: 8/15 MS: 1 CopyPart- 00:06:59.961 [2024-11-27 06:16:29.472687] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32924) > buf size (4096) 00:06:59.961 [2024-11-27 06:16:29.472902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20260020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.961 [2024-11-27 06:16:29.472928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.961 #44 NEW cov: 11885 ft: 14131 corp: 22/181b lim: 30 exec/s: 44 rss: 69Mb L: 7/15 MS: 1 ChangeBinInt- 00:07:00.220 [2024-11-27 06:16:29.512800] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002017 00:07:00.220 [2024-11-27 06:16:29.513031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:7d200220 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.220 [2024-11-27 06:16:29.513056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.220 #45 NEW cov: 11885 ft: 14135 corp: 23/188b lim: 30 exec/s: 45 rss: 69Mb L: 7/15 MS: 1 ChangeBinInt- 00:07:00.220 [2024-11-27 06:16:29.552940] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (33256) > buf size (4096) 00:07:00.220 [2024-11-27 06:16:29.553175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20790020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.220 [2024-11-27 06:16:29.553201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.220 #46 NEW cov: 11885 ft: 14140 corp: 24/195b lim: 30 exec/s: 
46 rss: 69Mb L: 7/15 MS: 1 ChangeBit- 00:07:00.220 [2024-11-27 06:16:29.593116] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (233488) > buf size (4096) 00:07:00.220 [2024-11-27 06:16:29.593233] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (8224) > len (4) 00:07:00.220 [2024-11-27 06:16:29.593457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e4030000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.220 [2024-11-27 06:16:29.593483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.220 [2024-11-27 06:16:29.593540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0000002d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.220 [2024-11-27 06:16:29.593554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.220 #49 NEW cov: 11885 ft: 14150 corp: 25/207b lim: 30 exec/s: 49 rss: 69Mb L: 12/15 MS: 3 EraseBytes-ChangeByte-PersAutoDict- DE: "\344\003\000\000\000\000\000\000"- 00:07:00.220 [2024-11-27 06:16:29.633220] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32900) > buf size (4096) 00:07:00.220 [2024-11-27 06:16:29.633340] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10372) > buf size (4096) 00:07:00.220 [2024-11-27 06:16:29.633556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.220 [2024-11-27 06:16:29.633582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.220 [2024-11-27 06:16:29.633642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0a200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.220 [2024-11-27 06:16:29.633657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.220 #50 NEW cov: 11885 ft: 14241 corp: 26/220b lim: 30 exec/s: 50 rss: 69Mb L: 13/15 MS: 1 CopyPart- 00:07:00.220 [2024-11-27 06:16:29.673314] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10372) > buf size (4096) 00:07:00.220 [2024-11-27 06:16:29.673547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.220 [2024-11-27 06:16:29.673573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.220 #51 NEW cov: 11885 ft: 14297 corp: 27/228b lim: 30 exec/s: 51 rss: 69Mb L: 8/15 MS: 1 ChangeBit- 00:07:00.220 [2024-11-27 06:16:29.713485] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (128132) > buf size (4096) 00:07:00.220 [2024-11-27 06:16:29.713636] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (33684) > buf size (4096) 00:07:00.220 [2024-11-27 06:16:29.713860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:7d200023 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.220 [2024-11-27 06:16:29.713885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.220 [2024-11-27 06:16:29.713944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:20e40003 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.220 [2024-11-27 06:16:29.713958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.220 #52 NEW cov: 11885 ft: 14312 corp: 28/244b lim: 30 exec/s: 52 rss: 70Mb L: 16/16 MS: 1 PersAutoDict- DE: "\344\003\000\000\000\000\000\000"- 00:07:00.221 [2024-11-27 06:16:29.753575] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (8324) > buf size (4096) 00:07:00.221 [2024-11-27 06:16:29.753802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.221 [2024-11-27 06:16:29.753828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.480 #53 NEW cov: 11885 ft: 14314 corp: 29/252b lim: 30 exec/s: 53 rss: 70Mb L: 8/16 MS: 1 CopyPart- 00:07:00.480 [2024-11-27 06:16:29.783619] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (259204) > buf size (4096) 00:07:00.480 [2024-11-27 06:16:29.783858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:fd200024 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.480 [2024-11-27 06:16:29.783882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.480 #54 NEW cov: 11885 ft: 14322 corp: 30/260b lim: 30 exec/s: 54 rss: 70Mb L: 8/16 MS: 1 ChangeBit- 00:07:00.480 [2024-11-27 06:16:29.823766] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32900) > buf size (4096) 00:07:00.480 [2024-11-27 06:16:29.823982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20200008 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.480 [2024-11-27 06:16:29.824008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.480 #55 NEW cov: 11885 ft: 14330 corp: 31/268b lim: 30 exec/s: 55 rss: 70Mb L: 8/16 MS: 1 ShuffleBytes- 00:07:00.480 [2024-11-27 06:16:29.863877] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32924) > buf size (4096) 00:07:00.480 [2024-11-27 06:16:29.863998] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11156) > buf size (4096) 00:07:00.480 [2024-11-27 06:16:29.864208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20260020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.480 [2024-11-27 06:16:29.864233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.480 [2024-11-27 06:16:29.864292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0ae40003 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.480 [2024-11-27 06:16:29.864306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.480 #56 NEW cov: 11885 ft: 14356 corp: 32/283b lim: 30 exec/s: 56 rss: 70Mb L: 15/16 MS: 1 PersAutoDict- DE: 
"\344\003\000\000\000\000\000\000"- 00:07:00.480 [2024-11-27 06:16:29.903993] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (8324) > buf size (4096) 00:07:00.480 [2024-11-27 06:16:29.904214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:082000cb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.480 [2024-11-27 06:16:29.904242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.480 #57 NEW cov: 11885 ft: 14361 corp: 33/292b lim: 30 exec/s: 57 rss: 70Mb L: 9/16 MS: 1 InsertByte- 00:07:00.480 [2024-11-27 06:16:29.944098] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (128132) > buf size (4096) 00:07:00.480 [2024-11-27 06:16:29.944316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:7d200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.480 [2024-11-27 06:16:29.944341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.480 #58 NEW cov: 11885 ft: 14372 corp: 34/301b lim: 30 exec/s: 58 rss: 70Mb L: 9/16 MS: 1 InsertByte- 00:07:00.480 [2024-11-27 06:16:29.974202] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32924) > buf size (4096) 00:07:00.480 [2024-11-27 06:16:29.974324] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11156) > buf size (4096) 00:07:00.480 [2024-11-27 06:16:29.974545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20260020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.480 [2024-11-27 06:16:29.974571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.480 [2024-11-27 06:16:29.974628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0ae40003 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.480 [2024-11-27 06:16:29.974643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.480 #59 NEW cov: 11885 ft: 14390 corp: 35/316b lim: 30 exec/s: 59 rss: 70Mb L: 15/16 MS: 1 ChangeBinInt- 00:07:00.480 [2024-11-27 06:16:30.014405] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32924) > buf size (4096) 00:07:00.480 [2024-11-27 06:16:30.014642] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (768) > len (132) 00:07:00.480 [2024-11-27 06:16:30.014860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20260020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.480 [2024-11-27 06:16:30.014887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.480 [2024-11-27 06:16:30.014942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:03000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.480 [2024-11-27 06:16:30.014957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.480 [2024-11-27 06:16:30.015011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0020000a 
cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.480 [2024-11-27 06:16:30.015024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.739 #60 NEW cov: 11895 ft: 14730 corp: 36/339b lim: 30 exec/s: 60 rss: 70Mb L: 23/23 MS: 1 PersAutoDict- DE: "\344\003\000\000\000\000\000\000"- 00:07:00.739 [2024-11-27 06:16:30.064475] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002017 00:07:00.739 [2024-11-27 06:16:30.064705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:7d200220 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.739 [2024-11-27 06:16:30.064731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.739 #61 NEW cov: 11895 ft: 14738 corp: 37/346b lim: 30 exec/s: 61 rss: 70Mb L: 7/23 MS: 1 CopyPart- 00:07:00.739 [2024-11-27 06:16:30.104643] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (128132) > buf size (4096) 00:07:00.739 [2024-11-27 06:16:30.104767] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (33272) > buf size (4096) 00:07:00.739 [2024-11-27 06:16:30.104993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:7d200023 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.739 [2024-11-27 06:16:30.105021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.739 [2024-11-27 06:16:30.105076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:207d000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.739 [2024-11-27 06:16:30.105090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.739 #62 NEW cov: 11895 ft: 14745 corp: 38/363b lim: 30 exec/s: 62 rss: 70Mb L: 17/23 MS: 1 CrossOver- 00:07:00.739 [2024-11-27 06:16:30.144729] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (8324) > buf size (4096) 00:07:00.739 [2024-11-27 06:16:30.144949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.739 [2024-11-27 06:16:30.144976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.739 #63 NEW cov: 11895 ft: 14778 corp: 39/371b lim: 30 exec/s: 63 rss: 70Mb L: 8/23 MS: 1 ShuffleBytes- 00:07:00.739 [2024-11-27 06:16:30.184889] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32900) > buf size (4096) 00:07:00.739 [2024-11-27 06:16:30.185010] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200b 00:07:00.739 [2024-11-27 06:16:30.185219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.740 [2024-11-27 06:16:30.185245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.740 [2024-11-27 06:16:30.185300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
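The "#N NEW" status lines in this run follow libFuzzer's usual format: cov is the number of covered code blocks/edges, ft the number of coverage features, corp the corpus size in inputs and total bytes, lim the current input-length cap, exec/s the execution rate, rss resident memory, L the new input's length versus the largest in the corpus, and MS the mutation sequence that produced it (CopyPart, CrossOver, PersAutoDict replaying a recommended-dictionary entry, and so on). To pull the coverage trajectory out of a saved copy of this output, something like the following works; the log file name is hypothetical:

    # prints "<cov> <ft>" per NEW line, in order of discovery
    grep -o 'NEW cov: [0-9]* ft: [0-9]*' run1.log | awk '{print $3, $5}'
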
00:07:00.740 [2024-11-27 06:16:30.185314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.740 #64 NEW cov: 11895 ft: 14782 corp: 40/385b lim: 30 exec/s: 64 rss: 70Mb L: 14/23 MS: 1 ChangeBit- 00:07:00.740 [2024-11-27 06:16:30.224990] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2026 00:07:00.740 [2024-11-27 06:16:30.225234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.740 [2024-11-27 06:16:30.225260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.740 #65 NEW cov: 11895 ft: 14801 corp: 41/393b lim: 30 exec/s: 65 rss: 70Mb L: 8/23 MS: 1 ChangeByte- 00:07:00.740 [2024-11-27 06:16:30.265121] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32900) > buf size (4096) 00:07:00.740 [2024-11-27 06:16:30.265259] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32900) > buf size (4096) 00:07:00.740 [2024-11-27 06:16:30.265483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.740 [2024-11-27 06:16:30.265509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.740 [2024-11-27 06:16:30.265565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.740 [2024-11-27 06:16:30.265580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.999 [2024-11-27 06:16:30.305169] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (32900) > buf size (4096) 00:07:00.999 [2024-11-27 06:16:30.305290] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (820092) > buf size (4096) 00:07:00.999 [2024-11-27 06:16:30.305500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:20200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.999 [2024-11-27 06:16:30.305529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.999 [2024-11-27 06:16:30.305585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:20de83df cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.999 [2024-11-27 06:16:30.305603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.999 #67 NEW cov: 11895 ft: 14807 corp: 42/407b lim: 30 exec/s: 67 rss: 70Mb L: 14/23 MS: 2 CopyPart-ChangeBinInt- 00:07:00.999 [2024-11-27 06:16:30.345279] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200a 00:07:00.999 [2024-11-27 06:16:30.345500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08000020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.999 [2024-11-27 06:16:30.345526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.999 #69 NEW cov: 11895 ft: 14884 corp: 
43/413b lim: 30 exec/s: 69 rss: 70Mb L: 6/23 MS: 2 EraseBytes-InsertByte- 00:07:00.999 [2024-11-27 06:16:30.385412] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (129156) > buf size (4096) 00:07:00.999 [2024-11-27 06:16:30.385647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:7e200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.999 [2024-11-27 06:16:30.385672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.999 #70 NEW cov: 11895 ft: 14885 corp: 44/421b lim: 30 exec/s: 35 rss: 70Mb L: 8/23 MS: 1 ChangeByte- 00:07:00.999 #70 DONE cov: 11895 ft: 14885 corp: 44/421b lim: 30 exec/s: 35 rss: 70Mb 00:07:00.999 ###### Recommended dictionary. ###### 00:07:01.000 "\344\003\000\000\000\000\000\000" # Uses: 4 00:07:01.000 ###### End of recommended dictionary. ###### 00:07:01.000 Done 70 runs in 2 second(s) 00:07:01.000 06:16:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:01.000 06:16:30 -- ../common.sh@72 -- # (( i++ )) 00:07:01.000 06:16:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:01.000 06:16:30 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:01.000 06:16:30 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:01.000 06:16:30 -- nvmf/run.sh@24 -- # local timen=1 00:07:01.000 06:16:30 -- nvmf/run.sh@25 -- # local core=0x1 00:07:01.000 06:16:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:01.000 06:16:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:01.000 06:16:30 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:01.000 06:16:30 -- nvmf/run.sh@29 -- # port=4402 00:07:01.000 06:16:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:01.000 06:16:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:01.000 06:16:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:01.000 06:16:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:01.260 [2024-11-27 06:16:30.558929] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
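The run.sh trace above shows how each fuzzer instance is isolated: run N is assigned NVMe/TCP port 44NN (via printf %02d), the shared fuzz_json.conf template has its trsvcid rewritten with sed into a per-run config under /tmp, and llvm_nvme_fuzz is then launched against that port with its own corpus directory and RPC socket. The templating step in isolation, as a minimal sketch with assumed variable names:

    i=2
    port="44$(printf '%02d' "$i")"      # run 2 -> port 4402
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        fuzz_json.conf > "/tmp/fuzz_json_${i}.conf"
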
00:07:01.260 [2024-11-27 06:16:30.559005] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid29765 ] 00:07:01.260 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.260 [2024-11-27 06:16:30.739407] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.520 [2024-11-27 06:16:30.804330] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:01.520 [2024-11-27 06:16:30.804459] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.520 [2024-11-27 06:16:30.862437] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:01.520 [2024-11-27 06:16:30.878820] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:01.520 INFO: Running with entropic power schedule (0xFF, 100). 00:07:01.520 INFO: Seed: 2348193615 00:07:01.520 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:01.520 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:01.520 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:01.520 INFO: A corpus is not provided, starting from an empty corpus 00:07:01.520 #2 INITED exec/s: 0 rss: 60Mb 00:07:01.520 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:01.520 This may also happen if the target rejected all inputs we tried so far 00:07:01.520 [2024-11-27 06:16:30.944699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.520 [2024-11-27 06:16:30.944734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.780 NEW_FUNC[1/670]: 0x43db78 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:01.780 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:01.780 #18 NEW cov: 11580 ft: 11581 corp: 2/12b lim: 35 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 InsertRepeatedBytes- 00:07:01.780 [2024-11-27 06:16:31.255705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.780 [2024-11-27 06:16:31.255740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.780 #19 NEW cov: 11693 ft: 12264 corp: 3/23b lim: 35 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 ChangeByte- 00:07:02.039 [2024-11-27 06:16:31.315725] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:02.039 [2024-11-27 06:16:31.316114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.039 [2024-11-27 06:16:31.316145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.039 [2024-11-27 06:16:31.316289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 
nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.040 [2024-11-27 06:16:31.316316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.040 #20 NEW cov: 11708 ft: 12923 corp: 4/38b lim: 35 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 CrossOver- 00:07:02.040 [2024-11-27 06:16:31.365832] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:02.040 [2024-11-27 06:16:31.366209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.040 [2024-11-27 06:16:31.366241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.040 [2024-11-27 06:16:31.366377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:8000922d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.040 [2024-11-27 06:16:31.366400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.040 #21 NEW cov: 11793 ft: 13245 corp: 5/57b lim: 35 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 CMP- DE: "\000\222-\200\341\200\210:"- 00:07:02.040 [2024-11-27 06:16:31.416286] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:02.040 [2024-11-27 06:16:31.416703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.040 [2024-11-27 06:16:31.416734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.040 [2024-11-27 06:16:31.416877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a60000a6 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.040 [2024-11-27 06:16:31.416897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.040 [2024-11-27 06:16:31.417034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:2d000092 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.040 [2024-11-27 06:16:31.417066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.040 #22 NEW cov: 11793 ft: 13538 corp: 6/84b lim: 35 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:07:02.040 [2024-11-27 06:16:31.476895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.040 [2024-11-27 06:16:31.476929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.040 [2024-11-27 06:16:31.477082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.040 [2024-11-27 06:16:31.477103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.040 [2024-11-27 06:16:31.477243] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.040 [2024-11-27 06:16:31.477263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.040 #23 NEW cov: 11793 ft: 13651 corp: 7/109b lim: 35 exec/s: 0 rss: 68Mb L: 25/27 MS: 1 InsertRepeatedBytes- 00:07:02.040 [2024-11-27 06:16:31.526413] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:02.040 [2024-11-27 06:16:31.526831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.040 [2024-11-27 06:16:31.526862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.040 [2024-11-27 06:16:31.526998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.040 [2024-11-27 06:16:31.527025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.040 #24 NEW cov: 11793 ft: 13701 corp: 8/127b lim: 35 exec/s: 0 rss: 68Mb L: 18/27 MS: 1 InsertRepeatedBytes- 00:07:02.300 [2024-11-27 06:16:31.586559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.586586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.300 #25 NEW cov: 11793 ft: 13754 corp: 9/140b lim: 35 exec/s: 0 rss: 68Mb L: 13/27 MS: 1 EraseBytes- 00:07:02.300 [2024-11-27 06:16:31.637635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.637663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.300 [2024-11-27 06:16:31.637807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a60000a6 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.637826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.300 [2024-11-27 06:16:31.637969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000073 cdw11:92000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.637988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.300 [2024-11-27 06:16:31.638118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:e1800080 cdw11:0000883a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.638135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.300 #26 NEW cov: 11793 ft: 14318 corp: 10/168b lim: 35 exec/s: 0 rss: 68Mb L: 28/28 MS: 1 InsertByte- 00:07:02.300 [2024-11-27 06:16:31.697547] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ff00fdff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.697577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.300 [2024-11-27 06:16:31.697723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.697740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.300 [2024-11-27 06:16:31.697847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.697864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.300 #27 NEW cov: 11793 ft: 14349 corp: 11/193b lim: 35 exec/s: 0 rss: 69Mb L: 25/28 MS: 1 ChangeBinInt- 00:07:02.300 [2024-11-27 06:16:31.757299] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:02.300 [2024-11-27 06:16:31.757835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.757865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.300 [2024-11-27 06:16:31.758001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:2d000092 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.758027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.300 [2024-11-27 06:16:31.758144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:808800e1 cdw11:2d003a92 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.758164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.300 #28 NEW cov: 11793 ft: 14361 corp: 12/220b lim: 35 exec/s: 0 rss: 69Mb L: 27/28 MS: 1 CopyPart- 00:07:02.300 [2024-11-27 06:16:31.808287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.808314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.300 [2024-11-27 06:16:31.808452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a60000a6 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.808470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.300 [2024-11-27 06:16:31.808607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000073 cdw11:92000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.808626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.300 [2024-11-27 06:16:31.808753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:e1800080 cdw11:ff0088ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.300 [2024-11-27 06:16:31.808770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.560 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:02.560 #29 NEW cov: 11816 ft: 14407 corp: 13/252b lim: 35 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:02.560 [2024-11-27 06:16:31.867473] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:02.560 [2024-11-27 06:16:31.867874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.560 [2024-11-27 06:16:31.867903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.560 [2024-11-27 06:16:31.868034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.560 [2024-11-27 06:16:31.868055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.560 #30 NEW cov: 11816 ft: 14425 corp: 14/267b lim: 35 exec/s: 0 rss: 69Mb L: 15/32 MS: 1 CMP- DE: "\000\000\000\010"- 00:07:02.560 [2024-11-27 06:16:31.917833] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:02.560 [2024-11-27 06:16:31.918359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.560 [2024-11-27 06:16:31.918387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.560 [2024-11-27 06:16:31.918522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ce000092 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.560 [2024-11-27 06:16:31.918547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.560 [2024-11-27 06:16:31.918680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:808800e1 cdw11:2d003a92 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.560 [2024-11-27 06:16:31.918698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.560 #31 NEW cov: 11816 ft: 14450 corp: 15/294b lim: 35 exec/s: 31 rss: 69Mb L: 27/32 MS: 1 ChangeBinInt- 00:07:02.560 [2024-11-27 06:16:31.977956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.560 [2024-11-27 06:16:31.977986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.560 #32 NEW cov: 11816 ft: 14467 corp: 16/307b lim: 35 exec/s: 32 rss: 69Mb L: 13/32 MS: 1 CopyPart- 00:07:02.560 [2024-11-27 
06:16:32.038713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ff00fdff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.560 [2024-11-27 06:16:32.038743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.560 [2024-11-27 06:16:32.038879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:0000ff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.560 [2024-11-27 06:16:32.038901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.560 [2024-11-27 06:16:32.039045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.560 [2024-11-27 06:16:32.039064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.560 #33 NEW cov: 11816 ft: 14482 corp: 17/332b lim: 35 exec/s: 33 rss: 69Mb L: 25/32 MS: 1 CrossOver- 00:07:02.820 [2024-11-27 06:16:32.098618] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:02.820 [2024-11-27 06:16:32.099037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:0000a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.099067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.820 [2024-11-27 06:16:32.099202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:80e1002d cdw11:3a008088 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.099221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.820 [2024-11-27 06:16:32.099355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:2d000092 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.099379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.820 #34 NEW cov: 11816 ft: 14491 corp: 18/359b lim: 35 exec/s: 34 rss: 69Mb L: 27/32 MS: 1 PersAutoDict- DE: "\000\222-\200\341\200\210:"- 00:07:02.820 [2024-11-27 06:16:32.149283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.149312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.820 [2024-11-27 06:16:32.149457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a60000a6 cdw11:00000080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.149476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.820 [2024-11-27 06:16:32.149612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000073 cdw11:92000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 
06:16:32.149631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.820 [2024-11-27 06:16:32.149756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:e1800080 cdw11:0000883a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.149775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.820 #35 NEW cov: 11816 ft: 14504 corp: 19/387b lim: 35 exec/s: 35 rss: 69Mb L: 28/32 MS: 1 ChangeBit- 00:07:02.820 [2024-11-27 06:16:32.198823] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:02.820 [2024-11-27 06:16:32.199571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.199604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.820 [2024-11-27 06:16:32.199745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:2d000092 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.199774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.820 [2024-11-27 06:16:32.199916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b8b800e1 cdw11:b800b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.199937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.820 [2024-11-27 06:16:32.200074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:883a0080 cdw11:8000922d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.200093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.820 #36 NEW cov: 11816 ft: 14518 corp: 20/420b lim: 35 exec/s: 36 rss: 69Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:07:02.820 [2024-11-27 06:16:32.249373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:0000a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.249403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.820 [2024-11-27 06:16:32.249544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:80e1002d cdw11:3a008088 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.249561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.820 [2024-11-27 06:16:32.249704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000080 cdw11:2d000092 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.249722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.820 #37 NEW cov: 11816 ft: 14538 corp: 21/447b lim: 35 exec/s: 37 rss: 69Mb L: 27/33 MS: 1 
ChangeBit- 00:07:02.820 [2024-11-27 06:16:32.309004] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:02.820 [2024-11-27 06:16:32.309388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a0000ba cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.309420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.820 [2024-11-27 06:16:32.309561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.820 [2024-11-27 06:16:32.309586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.820 #40 NEW cov: 11816 ft: 14553 corp: 22/461b lim: 35 exec/s: 40 rss: 69Mb L: 14/33 MS: 3 ChangeByte-ChangeBit-CrossOver- 00:07:03.080 [2024-11-27 06:16:32.359149] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:03.080 [2024-11-27 06:16:32.359537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000008 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.080 [2024-11-27 06:16:32.359568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.080 [2024-11-27 06:16:32.359704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.080 [2024-11-27 06:16:32.359724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.080 #41 NEW cov: 11816 ft: 14610 corp: 23/476b lim: 35 exec/s: 41 rss: 69Mb L: 15/33 MS: 1 PersAutoDict- DE: "\000\000\000\010"- 00:07:03.080 [2024-11-27 06:16:32.419564] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:03.081 [2024-11-27 06:16:32.420349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.420379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.081 [2024-11-27 06:16:32.420512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:2d000092 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.420533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.081 [2024-11-27 06:16:32.420671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b8b800e1 cdw11:b800bcb8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.420689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.081 [2024-11-27 06:16:32.420827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:883a0080 cdw11:8000922d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.420843] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.081 #42 NEW cov: 11816 ft: 14641 corp: 24/509b lim: 35 exec/s: 42 rss: 69Mb L: 33/33 MS: 1 ChangeBit- 00:07:03.081 [2024-11-27 06:16:32.479785] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:03.081 [2024-11-27 06:16:32.480513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.480545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.081 [2024-11-27 06:16:32.480675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ff0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.480695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.081 [2024-11-27 06:16:32.480837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.480856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.081 [2024-11-27 06:16:32.480984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.481003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.081 #43 NEW cov: 11816 ft: 14655 corp: 25/541b lim: 35 exec/s: 43 rss: 69Mb L: 32/33 MS: 1 InsertRepeatedBytes- 00:07:03.081 [2024-11-27 06:16:32.540484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:0000a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.540512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.081 [2024-11-27 06:16:32.540655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:80e1002d cdw11:3a008088 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.540673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.081 [2024-11-27 06:16:32.540796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000080 cdw11:2d000092 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.540816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.081 #44 NEW cov: 11816 ft: 14673 corp: 26/568b lim: 35 exec/s: 44 rss: 70Mb L: 27/33 MS: 1 ChangeBit- 00:07:03.081 [2024-11-27 06:16:32.600024] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:03.081 [2024-11-27 06:16:32.600568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.600600] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.081 [2024-11-27 06:16:32.600729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ce000092 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.600744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.081 [2024-11-27 06:16:32.600873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:808800e1 cdw11:2d003a92 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.081 [2024-11-27 06:16:32.600890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.341 #45 NEW cov: 11816 ft: 14697 corp: 27/595b lim: 35 exec/s: 45 rss: 70Mb L: 27/33 MS: 1 ShuffleBytes- 00:07:03.341 [2024-11-27 06:16:32.660466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:8000a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.341 [2024-11-27 06:16:32.660493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.341 [2024-11-27 06:16:32.660633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:922d003a cdw11:800080e1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.341 [2024-11-27 06:16:32.660651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.341 #46 NEW cov: 11816 ft: 14705 corp: 28/612b lim: 35 exec/s: 46 rss: 70Mb L: 17/33 MS: 1 EraseBytes- 00:07:03.341 [2024-11-27 06:16:32.710296] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:03.341 [2024-11-27 06:16:32.710878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.341 [2024-11-27 06:16:32.710908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.341 [2024-11-27 06:16:32.711043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ce000092 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.341 [2024-11-27 06:16:32.711064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.341 [2024-11-27 06:16:32.711200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:808800e1 cdw11:2d003a92 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.341 [2024-11-27 06:16:32.711219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.341 #47 NEW cov: 11816 ft: 14728 corp: 29/639b lim: 35 exec/s: 47 rss: 70Mb L: 27/33 MS: 1 ShuffleBytes- 00:07:03.341 [2024-11-27 06:16:32.760691] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:03.341 [2024-11-27 06:16:32.761396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.341 [2024-11-27 06:16:32.761429] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.341 [2024-11-27 06:16:32.761559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ff0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.341 [2024-11-27 06:16:32.761582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.341 [2024-11-27 06:16:32.761734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0100ff cdw11:ff0009ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.341 [2024-11-27 06:16:32.761762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.341 [2024-11-27 06:16:32.761908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.341 [2024-11-27 06:16:32.761928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.341 #48 NEW cov: 11816 ft: 14747 corp: 30/671b lim: 35 exec/s: 48 rss: 70Mb L: 32/33 MS: 1 ChangeBinInt- 00:07:03.341 [2024-11-27 06:16:32.821302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.341 [2024-11-27 06:16:32.821332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.341 [2024-11-27 06:16:32.821483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a60000a6 cdw11:00000080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.341 [2024-11-27 06:16:32.821501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.341 [2024-11-27 06:16:32.821633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000073 cdw11:92000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.341 [2024-11-27 06:16:32.821652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.341 #49 NEW cov: 11816 ft: 14756 corp: 31/694b lim: 35 exec/s: 49 rss: 70Mb L: 23/33 MS: 1 EraseBytes- 00:07:03.601 [2024-11-27 06:16:32.881099] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:03.601 [2024-11-27 06:16:32.881289] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:03.601 [2024-11-27 06:16:32.881463] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:03.601 [2024-11-27 06:16:32.881846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000008 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.601 [2024-11-27 06:16:32.881875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.601 [2024-11-27 06:16:32.882010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.601 [2024-11-27 
06:16:32.882033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.601 [2024-11-27 06:16:32.882179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.601 [2024-11-27 06:16:32.882204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.601 [2024-11-27 06:16:32.882341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.601 [2024-11-27 06:16:32.882367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.601 #50 NEW cov: 11816 ft: 14776 corp: 32/723b lim: 35 exec/s: 50 rss: 70Mb L: 29/33 MS: 1 InsertRepeatedBytes- 00:07:03.601 [2024-11-27 06:16:32.931407] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:03.601 [2024-11-27 06:16:32.932011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:0000a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.601 [2024-11-27 06:16:32.932043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.601 [2024-11-27 06:16:32.932202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:4e4e004e cdw11:4e004e4e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.601 [2024-11-27 06:16:32.932224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.601 [2024-11-27 06:16:32.932358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:2d80004e cdw11:8800e180 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.601 [2024-11-27 06:16:32.932379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.601 [2024-11-27 06:16:32.932506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:80000000 cdw11:92000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.601 [2024-11-27 06:16:32.932526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.601 [2024-11-27 06:16:32.932669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:e1880080 cdw11:0000883a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:03.601 [2024-11-27 06:16:32.932691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.601 #51 NEW cov: 11816 ft: 14913 corp: 33/758b lim: 35 exec/s: 25 rss: 70Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:03.601 #51 DONE cov: 11816 ft: 14913 corp: 33/758b lim: 35 exec/s: 25 rss: 70Mb 00:07:03.601 ###### Recommended dictionary. ###### 00:07:03.601 "\000\222-\200\341\200\210:" # Uses: 1 00:07:03.601 "\000\000\000\010" # Uses: 1 00:07:03.601 ###### End of recommended dictionary. 
######
00:07:03.601 Done 51 runs in 2 second(s)
00:07:03.601 06:16:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf
00:07:03.601 06:16:33 -- ../common.sh@72 -- # (( i++ ))
00:07:03.601 06:16:33 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:03.601 06:16:33 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1
00:07:03.601 06:16:33 -- nvmf/run.sh@23 -- # local fuzzer_type=3
00:07:03.601 06:16:33 -- nvmf/run.sh@24 -- # local timen=1
00:07:03.601 06:16:33 -- nvmf/run.sh@25 -- # local core=0x1
00:07:03.601 06:16:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
00:07:03.601 06:16:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf
00:07:03.601 06:16:33 -- nvmf/run.sh@29 -- # printf %02d 3
00:07:03.601 06:16:33 -- nvmf/run.sh@29 -- # port=4403
00:07:03.601 06:16:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
00:07:03.601 06:16:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403'
00:07:03.601 06:16:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:03.601 06:16:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock
00:07:03.601 [2024-11-27 06:16:33.114804] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:03.601 [2024-11-27 06:16:33.114873] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid30239 ]
00:07:03.861 EAL: No free 2048 kB hugepages reported on node 1
00:07:03.861 [2024-11-27 06:16:33.290920] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:03.861 [2024-11-27 06:16:33.354185] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:03.861 [2024-11-27 06:16:33.354316] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:04.121 [2024-11-27 06:16:33.412136] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:04.121 [2024-11-27 06:16:33.428461] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 ***
00:07:04.121 INFO: Running with entropic power schedule (0xFF, 100).
00:07:04.121 INFO: Seed: 603232642
00:07:04.121 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:04.121 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:04.121 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
00:07:04.121 INFO: A corpus is not provided, starting from an empty corpus
00:07:04.121 #2 INITED exec/s: 0 rss: 60Mb
00:07:04.121 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:04.121 This may also happen if the target rejected all inputs we tried so far 00:07:04.380 NEW_FUNC[1/659]: 0x43f858 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:04.380 NEW_FUNC[2/659]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:04.380 #8 NEW cov: 11490 ft: 11465 corp: 2/12b lim: 20 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 InsertRepeatedBytes- 00:07:04.380 #19 NEW cov: 11607 ft: 12409 corp: 3/27b lim: 20 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 CrossOver- 00:07:04.380 #20 NEW cov: 11630 ft: 12844 corp: 4/43b lim: 20 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 InsertByte- 00:07:04.380 #21 NEW cov: 11715 ft: 13086 corp: 5/58b lim: 20 exec/s: 0 rss: 68Mb L: 15/16 MS: 1 ChangeByte- 00:07:04.644 #22 NEW cov: 11715 ft: 13149 corp: 6/66b lim: 20 exec/s: 0 rss: 68Mb L: 8/16 MS: 1 EraseBytes- 00:07:04.644 #23 NEW cov: 11715 ft: 13238 corp: 7/83b lim: 20 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 CrossOver- 00:07:04.644 #24 NEW cov: 11715 ft: 13282 corp: 8/93b lim: 20 exec/s: 0 rss: 68Mb L: 10/17 MS: 1 EraseBytes- 00:07:04.644 #25 NEW cov: 11715 ft: 13370 corp: 9/112b lim: 20 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 CopyPart- 00:07:04.644 #26 NEW cov: 11715 ft: 13401 corp: 10/129b lim: 20 exec/s: 0 rss: 68Mb L: 17/19 MS: 1 ChangeByte- 00:07:04.644 #29 NEW cov: 11715 ft: 13456 corp: 11/148b lim: 20 exec/s: 0 rss: 68Mb L: 19/19 MS: 3 InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:04.644 #30 NEW cov: 11715 ft: 13497 corp: 12/164b lim: 20 exec/s: 0 rss: 68Mb L: 16/19 MS: 1 InsertByte- 00:07:04.903 #31 NEW cov: 11715 ft: 13521 corp: 13/181b lim: 20 exec/s: 0 rss: 68Mb L: 17/19 MS: 1 ShuffleBytes- 00:07:04.903 #32 NEW cov: 11715 ft: 13523 corp: 14/198b lim: 20 exec/s: 0 rss: 69Mb L: 17/19 MS: 1 CopyPart- 00:07:04.903 #33 NEW cov: 11715 ft: 13591 corp: 15/216b lim: 20 exec/s: 0 rss: 69Mb L: 18/19 MS: 1 CopyPart- 00:07:04.903 #35 NEW cov: 11715 ft: 13621 corp: 16/225b lim: 20 exec/s: 0 rss: 69Mb L: 9/19 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:04.903 #36 NEW cov: 11715 ft: 13685 corp: 17/242b lim: 20 exec/s: 0 rss: 69Mb L: 17/19 MS: 1 ChangeBit- 00:07:04.903 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:04.903 #37 NEW cov: 11738 ft: 13722 corp: 18/258b lim: 20 exec/s: 0 rss: 69Mb L: 16/19 MS: 1 ShuffleBytes- 00:07:04.903 #38 NEW cov: 11738 ft: 13798 corp: 19/278b lim: 20 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:05.163 #39 NEW cov: 11738 ft: 13815 corp: 20/295b lim: 20 exec/s: 39 rss: 69Mb L: 17/20 MS: 1 CrossOver- 00:07:05.163 #40 NEW cov: 11738 ft: 13941 corp: 21/311b lim: 20 exec/s: 40 rss: 69Mb L: 16/20 MS: 1 ChangeByte- 00:07:05.163 NEW_FUNC[1/4]: 0x111e188 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:07:05.163 NEW_FUNC[2/4]: 0x111ed08 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:07:05.163 #41 NEW cov: 11822 ft: 14044 corp: 22/331b lim: 20 exec/s: 41 rss: 69Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:05.163 #42 NEW cov: 11822 ft: 14118 corp: 23/340b lim: 20 exec/s: 42 rss: 69Mb L: 9/20 MS: 1 InsertByte- 00:07:05.163 #43 NEW cov: 11822 ft: 14162 corp: 24/359b lim: 20 exec/s: 43 rss: 69Mb L: 19/20 MS: 1 ChangeByte- 00:07:05.163 #44 NEW cov: 11822 ft: 14172 corp: 25/379b lim: 20 exec/s: 44 rss: 69Mb L: 20/20 MS: 1 ChangeByte- 00:07:05.423 #45 NEW cov: 
11822 ft: 14196 corp: 26/387b lim: 20 exec/s: 45 rss: 69Mb L: 8/20 MS: 1 ChangeBit- 00:07:05.423 #46 NEW cov: 11822 ft: 14219 corp: 27/406b lim: 20 exec/s: 46 rss: 69Mb L: 19/20 MS: 1 InsertByte- 00:07:05.423 #47 NEW cov: 11822 ft: 14239 corp: 28/415b lim: 20 exec/s: 47 rss: 69Mb L: 9/20 MS: 1 CopyPart- 00:07:05.423 #48 NEW cov: 11822 ft: 14251 corp: 29/430b lim: 20 exec/s: 48 rss: 70Mb L: 15/20 MS: 1 ChangeByte- 00:07:05.423 #49 NEW cov: 11822 ft: 14536 corp: 30/435b lim: 20 exec/s: 49 rss: 70Mb L: 5/20 MS: 1 EraseBytes- 00:07:05.423 #50 NEW cov: 11822 ft: 14540 corp: 31/443b lim: 20 exec/s: 50 rss: 70Mb L: 8/20 MS: 1 CopyPart- 00:07:05.423 #51 NEW cov: 11822 ft: 14576 corp: 32/454b lim: 20 exec/s: 51 rss: 70Mb L: 11/20 MS: 1 ChangeBinInt- 00:07:05.681 #52 NEW cov: 11822 ft: 14585 corp: 33/472b lim: 20 exec/s: 52 rss: 70Mb L: 18/20 MS: 1 ChangeByte- 00:07:05.681 #53 NEW cov: 11822 ft: 14600 corp: 34/482b lim: 20 exec/s: 53 rss: 70Mb L: 10/20 MS: 1 EraseBytes- 00:07:05.681 #54 NEW cov: 11822 ft: 14638 corp: 35/500b lim: 20 exec/s: 54 rss: 70Mb L: 18/20 MS: 1 ChangeByte- 00:07:05.681 #55 NEW cov: 11822 ft: 14653 corp: 36/518b lim: 20 exec/s: 55 rss: 70Mb L: 18/20 MS: 1 CrossOver- 00:07:05.681 #56 NEW cov: 11822 ft: 14657 corp: 37/536b lim: 20 exec/s: 56 rss: 70Mb L: 18/20 MS: 1 ChangeByte- 00:07:05.681 #57 NEW cov: 11822 ft: 14712 corp: 38/556b lim: 20 exec/s: 57 rss: 70Mb L: 20/20 MS: 1 ChangeBit- 00:07:05.681 #58 NEW cov: 11822 ft: 14713 corp: 39/564b lim: 20 exec/s: 58 rss: 70Mb L: 8/20 MS: 1 EraseBytes- 00:07:05.940 #59 NEW cov: 11822 ft: 14735 corp: 40/574b lim: 20 exec/s: 59 rss: 70Mb L: 10/20 MS: 1 ChangeBit- 00:07:05.940 #60 NEW cov: 11822 ft: 14739 corp: 41/593b lim: 20 exec/s: 60 rss: 70Mb L: 19/20 MS: 1 ChangeByte- 00:07:05.940 #66 NEW cov: 11822 ft: 14765 corp: 42/610b lim: 20 exec/s: 66 rss: 70Mb L: 17/20 MS: 1 EraseBytes- 00:07:05.940 #67 NEW cov: 11822 ft: 14771 corp: 43/621b lim: 20 exec/s: 67 rss: 70Mb L: 11/20 MS: 1 InsertByte- 00:07:05.940 #68 NEW cov: 11822 ft: 14779 corp: 44/633b lim: 20 exec/s: 68 rss: 70Mb L: 12/20 MS: 1 CrossOver- 00:07:05.940 #69 NEW cov: 11822 ft: 14781 corp: 45/648b lim: 20 exec/s: 69 rss: 70Mb L: 15/20 MS: 1 ChangeByte- 00:07:06.199 #70 NEW cov: 11822 ft: 14792 corp: 46/667b lim: 20 exec/s: 35 rss: 70Mb L: 19/20 MS: 1 ChangeBinInt- 00:07:06.199 #70 DONE cov: 11822 ft: 14792 corp: 46/667b lim: 20 exec/s: 35 rss: 70Mb 00:07:06.199 Done 70 runs in 2 second(s) 00:07:06.199 06:16:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:06.199 06:16:35 -- ../common.sh@72 -- # (( i++ )) 00:07:06.199 06:16:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:06.199 06:16:35 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:06.199 06:16:35 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:06.199 06:16:35 -- nvmf/run.sh@24 -- # local timen=1 00:07:06.199 06:16:35 -- nvmf/run.sh@25 -- # local core=0x1 00:07:06.199 06:16:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:06.199 06:16:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:06.199 06:16:35 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:06.199 06:16:35 -- nvmf/run.sh@29 -- # port=4404 00:07:06.199 06:16:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:06.199 06:16:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:06.199 06:16:35 -- nvmf/run.sh@33 -- # sed -e 
's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 06:16:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock
00:07:06.459 [2024-11-27 06:16:35.651691] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:06.459 [2024-11-27 06:16:35.651786] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid30781 ]
00:07:06.459 EAL: No free 2048 kB hugepages reported on node 1
00:07:06.458 [2024-11-27 06:16:35.828576] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:06.458 [2024-11-27 06:16:35.891950] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:06.458 [2024-11-27 06:16:35.892091] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:06.459 [2024-11-27 06:16:35.949945] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:06.459 [2024-11-27 06:16:35.966298] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 ***
00:07:06.459 INFO: Running with entropic power schedule (0xFF, 100).
00:07:06.459 INFO: Seed: 3141210001
00:07:06.718 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:06.718 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:06.718 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4
00:07:06.718 INFO: A corpus is not provided, starting from an empty corpus
00:07:06.718 #2 INITED exec/s: 0 rss: 60Mb
00:07:06.718 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:06.718 This may also happen if the target rejected all inputs we tried so far 00:07:06.718 [2024-11-27 06:16:36.022008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.718 [2024-11-27 06:16:36.022038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.718 [2024-11-27 06:16:36.022093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.718 [2024-11-27 06:16:36.022107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.718 [2024-11-27 06:16:36.022160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.718 [2024-11-27 06:16:36.022173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.718 [2024-11-27 06:16:36.022224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.718 [2024-11-27 06:16:36.022237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.977 NEW_FUNC[1/671]: 0x440958 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:06.977 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:06.977 #4 NEW cov: 11598 ft: 11602 corp: 2/30b lim: 35 exec/s: 0 rss: 68Mb L: 29/29 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:06.977 [2024-11-27 06:16:36.342574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.977 [2024-11-27 06:16:36.342616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.977 [2024-11-27 06:16:36.342686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.342703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.978 [2024-11-27 06:16:36.342762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.342791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.978 #5 NEW cov: 11714 ft: 12354 corp: 3/57b lim: 35 exec/s: 0 rss: 68Mb L: 27/29 MS: 1 EraseBytes- 00:07:06.978 [2024-11-27 06:16:36.392775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.392800] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.978 [2024-11-27 06:16:36.392868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.392882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.978 [2024-11-27 06:16:36.392933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:58ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.392946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.978 [2024-11-27 06:16:36.392997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.393009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.978 #6 NEW cov: 11720 ft: 12622 corp: 4/85b lim: 35 exec/s: 0 rss: 68Mb L: 28/29 MS: 1 InsertByte- 00:07:06.978 [2024-11-27 06:16:36.432851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.432875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.978 [2024-11-27 06:16:36.432926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.432940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.978 [2024-11-27 06:16:36.432991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:58ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.433003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.978 [2024-11-27 06:16:36.433055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff3b0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.433068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.978 #7 NEW cov: 11805 ft: 12867 corp: 5/113b lim: 35 exec/s: 0 rss: 68Mb L: 28/29 MS: 1 ChangeByte- 00:07:06.978 [2024-11-27 06:16:36.472981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.473006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.978 [2024-11-27 06:16:36.473057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.473071] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.978 [2024-11-27 06:16:36.473120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.473133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.978 [2024-11-27 06:16:36.473184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ff58ffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.978 [2024-11-27 06:16:36.473199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.978 #8 NEW cov: 11805 ft: 13035 corp: 6/147b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:07.238 [2024-11-27 06:16:36.513066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.513093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.513161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff5cff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.513174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.513226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:58ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.513238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.513289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.513302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.238 #9 NEW cov: 11805 ft: 13076 corp: 7/175b lim: 35 exec/s: 0 rss: 68Mb L: 28/34 MS: 1 ChangeByte- 00:07:07.238 [2024-11-27 06:16:36.553172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.553197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.553252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.553265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.553316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:58ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.553329] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.553381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:7fffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.553394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.238 #10 NEW cov: 11805 ft: 13170 corp: 8/203b lim: 35 exec/s: 0 rss: 68Mb L: 28/34 MS: 1 ChangeBit- 00:07:07.238 [2024-11-27 06:16:36.593149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.593174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.593227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.593241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.593292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.593308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.238 #16 NEW cov: 11805 ft: 13294 corp: 9/227b lim: 35 exec/s: 0 rss: 68Mb L: 24/34 MS: 1 CrossOver- 00:07:07.238 [2024-11-27 06:16:36.633420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.633446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.633500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.633514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.633566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.633579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.633634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.633647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.238 #17 NEW cov: 11805 ft: 13322 corp: 10/260b lim: 35 exec/s: 0 rss: 68Mb L: 33/34 MS: 1 CMP- DE: "\000\000\000\003"- 00:07:07.238 [2024-11-27 06:16:36.673559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 
cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.673585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.673643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.673657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.673708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.673721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.673771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:fffffbff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.673783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.238 #18 NEW cov: 11805 ft: 13348 corp: 11/289b lim: 35 exec/s: 0 rss: 68Mb L: 29/34 MS: 1 ChangeBit- 00:07:07.238 [2024-11-27 06:16:36.713673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.713698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.713752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.713765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.713815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.713830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.713881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.713893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.238 #19 NEW cov: 11805 ft: 13374 corp: 12/323b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 CrossOver- 00:07:07.238 [2024-11-27 06:16:36.753828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.753853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.753921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 
cdw10:0000ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.753935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.753986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.753998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.238 [2024-11-27 06:16:36.754051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ff58ffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.238 [2024-11-27 06:16:36.754064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.498 #20 NEW cov: 11805 ft: 13391 corp: 13/357b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 CopyPart- 00:07:07.498 [2024-11-27 06:16:36.793729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:0a3f0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.498 [2024-11-27 06:16:36.793754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.498 [2024-11-27 06:16:36.793808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.498 [2024-11-27 06:16:36.793821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.498 [2024-11-27 06:16:36.793873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.498 [2024-11-27 06:16:36.793885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.498 #21 NEW cov: 11805 ft: 13423 corp: 14/381b lim: 35 exec/s: 0 rss: 69Mb L: 24/34 MS: 1 ChangeByte- 00:07:07.498 [2024-11-27 06:16:36.833752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff7aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.498 [2024-11-27 06:16:36.833777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.498 [2024-11-27 06:16:36.833847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.498 [2024-11-27 06:16:36.833861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.498 #23 NEW cov: 11805 ft: 13704 corp: 15/397b lim: 35 exec/s: 0 rss: 69Mb L: 16/34 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:07.498 [2024-11-27 06:16:36.874147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.498 [2024-11-27 06:16:36.874172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.498 
[2024-11-27 06:16:36.874224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffef cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.498 [2024-11-27 06:16:36.874237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.498 [2024-11-27 06:16:36.874288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.498 [2024-11-27 06:16:36.874301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.498 [2024-11-27 06:16:36.874352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.498 [2024-11-27 06:16:36.874364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.498 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:07.498 #24 NEW cov: 11828 ft: 13715 corp: 16/431b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 ChangeBit- 00:07:07.498 [2024-11-27 06:16:36.914264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.498 [2024-11-27 06:16:36.914290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.498 [2024-11-27 06:16:36.914342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.498 [2024-11-27 06:16:36.914356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.498 [2024-11-27 06:16:36.914407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.498 [2024-11-27 06:16:36.914420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.498 [2024-11-27 06:16:36.914472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.499 [2024-11-27 06:16:36.914484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.499 #25 NEW cov: 11828 ft: 13742 corp: 17/464b lim: 35 exec/s: 0 rss: 69Mb L: 33/34 MS: 1 ShuffleBytes- 00:07:07.499 [2024-11-27 06:16:36.954391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.499 [2024-11-27 06:16:36.954416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.499 [2024-11-27 06:16:36.954468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff5cff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.499 [2024-11-27 06:16:36.954482] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.499 [2024-11-27 06:16:36.954530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:58ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.499 [2024-11-27 06:16:36.954543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.499 [2024-11-27 06:16:36.954603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.499 [2024-11-27 06:16:36.954616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.499 #26 NEW cov: 11828 ft: 13757 corp: 18/492b lim: 35 exec/s: 0 rss: 69Mb L: 28/34 MS: 1 CrossOver- 00:07:07.499 [2024-11-27 06:16:36.994542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:0a3f0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.499 [2024-11-27 06:16:36.994567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.499 [2024-11-27 06:16:36.994621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.499 [2024-11-27 06:16:36.994653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.499 [2024-11-27 06:16:36.994708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.499 [2024-11-27 06:16:36.994721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.499 [2024-11-27 06:16:36.994774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.499 [2024-11-27 06:16:36.994787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.499 #27 NEW cov: 11828 ft: 13780 corp: 19/525b lim: 35 exec/s: 27 rss: 69Mb L: 33/34 MS: 1 CopyPart- 00:07:07.758 [2024-11-27 06:16:37.034670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.034695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.758 [2024-11-27 06:16:37.034766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.034781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.758 [2024-11-27 06:16:37.034834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.034847] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.758 [2024-11-27 06:16:37.034899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.034912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.758 #28 NEW cov: 11828 ft: 13793 corp: 20/553b lim: 35 exec/s: 28 rss: 69Mb L: 28/34 MS: 1 CrossOver- 00:07:07.758 [2024-11-27 06:16:37.064780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.064805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.758 [2024-11-27 06:16:37.064874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:000040ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.064888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.758 [2024-11-27 06:16:37.064942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.064955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.758 [2024-11-27 06:16:37.065006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ff58ffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.065019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.758 #29 NEW cov: 11828 ft: 13877 corp: 21/587b lim: 35 exec/s: 29 rss: 69Mb L: 34/34 MS: 1 ChangeByte- 00:07:07.758 [2024-11-27 06:16:37.104845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:23ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.104871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.758 [2024-11-27 06:16:37.104925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.104939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.758 [2024-11-27 06:16:37.104993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff580003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.105006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.758 [2024-11-27 06:16:37.105060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.105073] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.758 #30 NEW cov: 11828 ft: 13901 corp: 22/616b lim: 35 exec/s: 30 rss: 69Mb L: 29/34 MS: 1 InsertByte- 00:07:07.758 [2024-11-27 06:16:37.144973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.144998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.758 [2024-11-27 06:16:37.145069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000ff01 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.758 [2024-11-27 06:16:37.145083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.758 [2024-11-27 06:16:37.145136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffff0004 cdw11:58ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.145150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.759 [2024-11-27 06:16:37.145203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff3b0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.145217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.759 #31 NEW cov: 11828 ft: 13917 corp: 23/644b lim: 35 exec/s: 31 rss: 69Mb L: 28/34 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\004"- 00:07:07.759 [2024-11-27 06:16:37.185100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.185124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.759 [2024-11-27 06:16:37.185176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.185189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.759 [2024-11-27 06:16:37.185242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.185255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.759 [2024-11-27 06:16:37.185308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffff58 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.185320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.759 #32 NEW cov: 11828 ft: 13973 corp: 24/676b lim: 35 exec/s: 32 rss: 69Mb L: 32/34 MS: 1 EraseBytes- 00:07:07.759 [2024-11-27 06:16:37.225361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.225386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.759 [2024-11-27 06:16:37.225439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b4b4b4b4 cdw11:b4b40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.225453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.759 [2024-11-27 06:16:37.225505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffff5cff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.225518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.759 [2024-11-27 06:16:37.225570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:58ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.225582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.759 [2024-11-27 06:16:37.225653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.225666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:07.759 #33 NEW cov: 11828 ft: 14039 corp: 25/711b lim: 35 exec/s: 33 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:07.759 [2024-11-27 06:16:37.265316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.265340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.759 [2024-11-27 06:16:37.265392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff5cff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.265406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.759 [2024-11-27 06:16:37.265456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:58ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.265469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.759 [2024-11-27 06:16:37.265519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.759 [2024-11-27 06:16:37.265534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.759 #34 NEW cov: 11828 ft: 14106 corp: 26/739b lim: 35 exec/s: 34 rss: 70Mb L: 28/35 MS: 1 ShuffleBytes- 00:07:08.019 [2024-11-27 06:16:37.305284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.305309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.019 [2024-11-27 06:16:37.305361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.305374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.019 [2024-11-27 06:16:37.305428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.305441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.019 #35 NEW cov: 11828 ft: 14193 corp: 27/763b lim: 35 exec/s: 35 rss: 70Mb L: 24/35 MS: 1 ChangeBit- 00:07:08.019 [2024-11-27 06:16:37.345731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.345755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.019 [2024-11-27 06:16:37.345825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b4b4b4b4 cdw11:b4b40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.345839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.019 [2024-11-27 06:16:37.345891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffff5cff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.345904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.019 [2024-11-27 06:16:37.345957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:58ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.345970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.019 [2024-11-27 06:16:37.346022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.346035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:08.019 #36 NEW cov: 11828 ft: 14211 corp: 28/798b lim: 35 exec/s: 36 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:08.019 [2024-11-27 06:16:37.385527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:23ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.385552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.019 [2024-11-27 06:16:37.385610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.385639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.019 [2024-11-27 06:16:37.385691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffff58 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.385708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.019 #37 NEW cov: 11828 ft: 14224 corp: 29/823b lim: 35 exec/s: 37 rss: 70Mb L: 25/35 MS: 1 EraseBytes- 00:07:08.019 [2024-11-27 06:16:37.425803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.425827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.019 [2024-11-27 06:16:37.425896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff5cff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.425910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.019 [2024-11-27 06:16:37.425963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:58ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.425976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.019 [2024-11-27 06:16:37.426026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.019 [2024-11-27 06:16:37.426039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.019 #38 NEW cov: 11828 ft: 14245 corp: 30/851b lim: 35 exec/s: 38 rss: 70Mb L: 28/35 MS: 1 ShuffleBytes- 00:07:08.020 [2024-11-27 06:16:37.465779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.020 [2024-11-27 06:16:37.465803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.020 [2024-11-27 06:16:37.465854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.020 [2024-11-27 06:16:37.465867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.020 [2024-11-27 06:16:37.465918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff480003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.020 [2024-11-27 06:16:37.465930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.020 #39 NEW cov: 11828 ft: 14290 corp: 31/875b lim: 35 exec/s: 39 rss: 70Mb L: 
24/35 MS: 1 ChangeByte- 00:07:08.020 [2024-11-27 06:16:37.506024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.020 [2024-11-27 06:16:37.506048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.020 [2024-11-27 06:16:37.506100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.020 [2024-11-27 06:16:37.506113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.020 [2024-11-27 06:16:37.506159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.020 [2024-11-27 06:16:37.506171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.020 [2024-11-27 06:16:37.506219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.020 [2024-11-27 06:16:37.506234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.020 #40 NEW cov: 11828 ft: 14308 corp: 32/909b lim: 35 exec/s: 40 rss: 70Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:07:08.020 [2024-11-27 06:16:37.546335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.020 [2024-11-27 06:16:37.546360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.020 [2024-11-27 06:16:37.546411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff00ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.020 [2024-11-27 06:16:37.546425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.020 [2024-11-27 06:16:37.546475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.020 [2024-11-27 06:16:37.546489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.020 [2024-11-27 06:16:37.546541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:58ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.020 [2024-11-27 06:16:37.546554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.020 [2024-11-27 06:16:37.546606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.020 [2024-11-27 06:16:37.546619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:08.280 #41 NEW cov: 11828 ft: 14318 corp: 33/944b lim: 35 exec/s: 41 rss: 
70Mb L: 35/35 MS: 1 CrossOver- 00:07:08.280 [2024-11-27 06:16:37.586262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:0a3f0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.586286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.586340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.586353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.586404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.586417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.586466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.586478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.280 #42 NEW cov: 11828 ft: 14324 corp: 34/977b lim: 35 exec/s: 42 rss: 70Mb L: 33/35 MS: 1 ChangeByte- 00:07:08.280 [2024-11-27 06:16:37.626368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.626393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.626445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff5cff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.626462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.626512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffff03ff cdw11:58ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.626525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.626574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.626586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.280 #43 NEW cov: 11828 ft: 14364 corp: 35/1005b lim: 35 exec/s: 43 rss: 70Mb L: 28/35 MS: 1 PersAutoDict- DE: "\000\000\000\003"- 00:07:08.280 [2024-11-27 06:16:37.666514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.666538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.666591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.666608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.666673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.666687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.666736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:fffffbff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.666749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.280 #44 NEW cov: 11828 ft: 14384 corp: 36/1034b lim: 35 exec/s: 44 rss: 70Mb L: 29/35 MS: 1 ShuffleBytes- 00:07:08.280 [2024-11-27 06:16:37.706783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.706808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.706860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.706874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.706923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:58ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.706951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.707002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.707015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.707065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ff3b0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.707081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:08.280 #45 NEW cov: 11828 ft: 14410 corp: 37/1069b lim: 35 exec/s: 45 rss: 70Mb L: 35/35 MS: 1 CrossOver- 00:07:08.280 [2024-11-27 06:16:37.746748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.746773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.746824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0003 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.746837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.746886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.746899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.746951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fbff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.746963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.280 #46 NEW cov: 11828 ft: 14414 corp: 38/1102b lim: 35 exec/s: 46 rss: 70Mb L: 33/35 MS: 1 PersAutoDict- DE: "\000\000\000\003"- 00:07:08.280 [2024-11-27 06:16:37.786735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0bff cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.786759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.280 [2024-11-27 06:16:37.786811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.280 [2024-11-27 06:16:37.786824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.281 [2024-11-27 06:16:37.786875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.281 [2024-11-27 06:16:37.786888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.281 #47 NEW cov: 11828 ft: 14420 corp: 39/1126b lim: 35 exec/s: 47 rss: 70Mb L: 24/35 MS: 1 ChangeBit- 00:07:08.540 [2024-11-27 06:16:37.826971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.540 [2024-11-27 06:16:37.826997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.540 [2024-11-27 06:16:37.827049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.540 [2024-11-27 06:16:37.827062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.540 [2024-11-27 06:16:37.827113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.540 [2024-11-27 06:16:37.827126] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.540 [2024-11-27 06:16:37.827176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.540 [2024-11-27 06:16:37.827192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.540 #48 NEW cov: 11828 ft: 14428 corp: 40/1160b lim: 35 exec/s: 48 rss: 70Mb L: 34/35 MS: 1 ChangeByte- 00:07:08.540 [2024-11-27 06:16:37.867104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0bff cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.540 [2024-11-27 06:16:37.867128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.540 [2024-11-27 06:16:37.867181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.540 [2024-11-27 06:16:37.867194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.540 [2024-11-27 06:16:37.867246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0000ff01 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.540 [2024-11-27 06:16:37.867259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.540 [2024-11-27 06:16:37.867310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffff0004 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.540 [2024-11-27 06:16:37.867323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.540 #49 NEW cov: 11828 ft: 14431 corp: 41/1192b lim: 35 exec/s: 49 rss: 70Mb L: 32/35 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\004"- 00:07:08.541 [2024-11-27 06:16:37.907184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.541 [2024-11-27 06:16:37.907208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.541 [2024-11-27 06:16:37.907260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffff3a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.541 [2024-11-27 06:16:37.907273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.541 [2024-11-27 06:16:37.907322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.541 [2024-11-27 06:16:37.907335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.541 [2024-11-27 06:16:37.907386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.541 
[2024-11-27 06:16:37.907398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.541 #50 NEW cov: 11828 ft: 14458 corp: 42/1226b lim: 35 exec/s: 50 rss: 70Mb L: 34/35 MS: 1 ChangeByte- 00:07:08.541 [2024-11-27 06:16:37.947292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.541 [2024-11-27 06:16:37.947316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.541 [2024-11-27 06:16:37.947369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.541 [2024-11-27 06:16:37.947382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.541 [2024-11-27 06:16:37.947432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.541 [2024-11-27 06:16:37.947448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.541 [2024-11-27 06:16:37.947500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.541 [2024-11-27 06:16:37.947512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.541 #51 NEW cov: 11828 ft: 14464 corp: 43/1260b lim: 35 exec/s: 51 rss: 70Mb L: 34/35 MS: 1 CrossOver- 00:07:08.541 [2024-11-27 06:16:37.987425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.541 [2024-11-27 06:16:37.987451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.541 [2024-11-27 06:16:37.987503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.541 [2024-11-27 06:16:37.987516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.541 [2024-11-27 06:16:37.987568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.541 [2024-11-27 06:16:37.987581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.541 [2024-11-27 06:16:37.987653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:04000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.541 [2024-11-27 06:16:37.987666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.541 #52 NEW cov: 11828 ft: 14465 corp: 44/1293b lim: 35 exec/s: 52 rss: 70Mb L: 33/35 MS: 1 CMP- DE: "\004\000\000\000"- 00:07:08.541 [2024-11-27 06:16:38.027517] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:08.541 [2024-11-27 06:16:38.027541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:08.541 [2024-11-27 06:16:38.027612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:60ffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:08.541 [2024-11-27 06:16:38.027627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:08.541 [2024-11-27 06:16:38.027678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:08.541 [2024-11-27 06:16:38.027691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:08.541 [2024-11-27 06:16:38.027744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:08.541 [2024-11-27 06:16:38.027768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:08.541 #53 NEW cov: 11828 ft: 14474 corp: 45/1327b lim: 35 exec/s: 26 rss: 70Mb L: 34/35 MS: 1 ChangeByte-
00:07:08.541 #53 DONE cov: 11828 ft: 14474 corp: 45/1327b lim: 35 exec/s: 26 rss: 70Mb
00:07:08.541 ###### Recommended dictionary. ######
00:07:08.541 "\000\000\000\003" # Uses: 2
00:07:08.541 "\001\000\000\000\000\000\000\004" # Uses: 1
00:07:08.541 "\004\000\000\000" # Uses: 0
00:07:08.541 ###### End of recommended dictionary. ######
00:07:08.541 Done 53 runs in 2 second(s)
00:07:08.800 06:16:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf
06:16:38 -- ../common.sh@72 -- # (( i++ ))
06:16:38 -- ../common.sh@72 -- # (( i < fuzz_num ))
06:16:38 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1
06:16:38 -- nvmf/run.sh@23 -- # local fuzzer_type=5
06:16:38 -- nvmf/run.sh@24 -- # local timen=1
06:16:38 -- nvmf/run.sh@25 -- # local core=0x1
06:16:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5
06:16:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf
06:16:38 -- nvmf/run.sh@29 -- # printf %02d 5
06:16:38 -- nvmf/run.sh@29 -- # port=4405
06:16:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5
06:16:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405'
06:16:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
06:16:38 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock
00:07:08.958 [2024-11-27 06:16:38.192542] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:08.958 [2024-11-27 06:16:38.192606] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid31082 ]
00:07:08.958 EAL: No free 2048 kB hugepages reported on node 1
00:07:09.060 [2024-11-27 06:16:38.375615] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:09.060 [2024-11-27 06:16:38.439595] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:09.060 [2024-11-27 06:16:38.439732] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:09.060 [2024-11-27 06:16:38.497867] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:09.060 [2024-11-27 06:16:38.514238] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 ***
00:07:09.060 INFO: Running with entropic power schedule (0xFF, 100).
00:07:09.060 INFO: Seed: 1395248198
00:07:09.060 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:09.060 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:09.060 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5
00:07:09.060 INFO: A corpus is not provided, starting from an empty corpus
00:07:09.060 #2 INITED exec/s: 0 rss: 60Mb
00:07:09.060 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:09.060 This may also happen if the target rejected all inputs we tried so far 00:07:09.060 [2024-11-27 06:16:38.590257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.060 [2024-11-27 06:16:38.590296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.578 NEW_FUNC[1/671]: 0x442af8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:09.578 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:09.578 #14 NEW cov: 11598 ft: 11613 corp: 2/18b lim: 45 exec/s: 0 rss: 68Mb L: 17/17 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:09.578 [2024-11-27 06:16:38.911252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.578 [2024-11-27 06:16:38.911290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.578 [2024-11-27 06:16:38.911411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.578 [2024-11-27 06:16:38.911428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.578 #15 NEW cov: 11725 ft: 12883 corp: 3/43b lim: 45 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 CMP- DE: "\001\000\177\024\374\000 S"- 00:07:09.578 [2024-11-27 06:16:38.961651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.578 [2024-11-27 06:16:38.961681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.578 [2024-11-27 06:16:38.961805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.578 [2024-11-27 06:16:38.961824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.578 [2024-11-27 06:16:38.961938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.578 [2024-11-27 06:16:38.961955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.578 #18 NEW cov: 11731 ft: 13438 corp: 4/76b lim: 45 exec/s: 0 rss: 68Mb L: 33/33 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:09.578 [2024-11-27 06:16:39.001482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fff8ffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.578 [2024-11-27 06:16:39.001512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.578 [2024-11-27 06:16:39.001622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.578 [2024-11-27 06:16:39.001651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.578 #19 NEW cov: 11816 ft: 13679 corp: 5/101b lim: 45 exec/s: 0 rss: 68Mb L: 25/33 MS: 1 ChangeBinInt- 00:07:09.578 [2024-11-27 06:16:39.051348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.579 [2024-11-27 06:16:39.051375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.579 #20 NEW cov: 11816 ft: 13773 corp: 6/112b lim: 45 exec/s: 0 rss: 68Mb L: 11/33 MS: 1 EraseBytes- 00:07:09.579 [2024-11-27 06:16:39.091803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.579 [2024-11-27 06:16:39.091831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.579 [2024-11-27 06:16:39.091949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:01000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.579 [2024-11-27 06:16:39.091967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.579 #21 NEW cov: 11816 ft: 13904 corp: 7/137b lim: 45 exec/s: 0 rss: 68Mb L: 25/33 MS: 1 PersAutoDict- DE: "\001\000\177\024\374\000 S"- 00:07:09.837 [2024-11-27 06:16:39.131828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.131855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.837 [2024-11-27 06:16:39.131969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.131987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.837 #22 NEW cov: 11816 ft: 13962 corp: 8/162b lim: 45 exec/s: 0 rss: 68Mb L: 25/33 MS: 1 PersAutoDict- DE: "\001\000\177\024\374\000 S"- 00:07:09.837 [2024-11-27 06:16:39.171989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.172017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.837 [2024-11-27 06:16:39.172137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.172165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.837 #23 NEW cov: 11816 ft: 13995 corp: 9/187b lim: 45 exec/s: 0 rss: 68Mb L: 25/33 MS: 1 PersAutoDict- DE: "\001\000\177\024\374\000 S"- 00:07:09.837 [2024-11-27 06:16:39.212110] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.212139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.837 [2024-11-27 06:16:39.212260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.212279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.837 #24 NEW cov: 11816 ft: 14029 corp: 10/209b lim: 45 exec/s: 0 rss: 68Mb L: 22/33 MS: 1 CopyPart- 00:07:09.837 [2024-11-27 06:16:39.252808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.252835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.837 [2024-11-27 06:16:39.252959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.252978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.837 [2024-11-27 06:16:39.253093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.253110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.837 [2024-11-27 06:16:39.253228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7f140100 cdw11:fc000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.253246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.837 #25 NEW cov: 11816 ft: 14444 corp: 11/245b lim: 45 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 CrossOver- 00:07:09.837 [2024-11-27 06:16:39.301767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:01000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.301796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.837 #26 NEW cov: 11816 ft: 14475 corp: 12/262b lim: 45 exec/s: 0 rss: 68Mb L: 17/36 MS: 1 PersAutoDict- DE: "\001\000\177\024\374\000 S"- 00:07:09.837 [2024-11-27 06:16:39.342617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.342648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.837 [2024-11-27 06:16:39.342771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.342788] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.837 [2024-11-27 06:16:39.342896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0101ffff cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.342913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.837 [2024-11-27 06:16:39.343046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.837 [2024-11-27 06:16:39.343064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.837 #27 NEW cov: 11816 ft: 14495 corp: 13/304b lim: 45 exec/s: 0 rss: 68Mb L: 42/42 MS: 1 CopyPart- 00:07:10.096 [2024-11-27 06:16:39.382407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.096 [2024-11-27 06:16:39.382436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.096 [2024-11-27 06:16:39.382555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.096 [2024-11-27 06:16:39.382572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.096 #28 NEW cov: 11816 ft: 14531 corp: 14/327b lim: 45 exec/s: 0 rss: 69Mb L: 23/42 MS: 1 InsertByte- 00:07:10.096 [2024-11-27 06:16:39.432666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fff8ffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.096 [2024-11-27 06:16:39.432702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.096 [2024-11-27 06:16:39.432832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ff01ffff cdw11:007f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.096 [2024-11-27 06:16:39.432850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.096 [2024-11-27 06:16:39.432969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffff2053 cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.096 [2024-11-27 06:16:39.432987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.096 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:10.096 #29 NEW cov: 11839 ft: 14565 corp: 15/360b lim: 45 exec/s: 0 rss: 69Mb L: 33/42 MS: 1 PersAutoDict- DE: "\001\000\177\024\374\000 S"- 00:07:10.096 [2024-11-27 06:16:39.482229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.096 [2024-11-27 06:16:39.482257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:07:10.096 #30 NEW cov: 11839 ft: 14580 corp: 16/371b lim: 45 exec/s: 0 rss: 69Mb L: 11/42 MS: 1 CMP- DE: "\377\377\001\000"- 00:07:10.096 [2024-11-27 06:16:39.523204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fff8ffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.096 [2024-11-27 06:16:39.523235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.096 [2024-11-27 06:16:39.523362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.096 [2024-11-27 06:16:39.523380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.096 [2024-11-27 06:16:39.523508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff01ffff cdw11:007f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.096 [2024-11-27 06:16:39.523526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.096 #31 NEW cov: 11839 ft: 14612 corp: 17/402b lim: 45 exec/s: 0 rss: 69Mb L: 31/42 MS: 1 CopyPart- 00:07:10.096 [2024-11-27 06:16:39.562872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.096 [2024-11-27 06:16:39.562900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.096 #32 NEW cov: 11839 ft: 14639 corp: 18/419b lim: 45 exec/s: 32 rss: 69Mb L: 17/42 MS: 1 ChangeBit- 00:07:10.096 [2024-11-27 06:16:39.603527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:01000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.096 [2024-11-27 06:16:39.603556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.096 [2024-11-27 06:16:39.603676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.096 [2024-11-27 06:16:39.603694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.096 [2024-11-27 06:16:39.603810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:fc007f14 cdw11:20530007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.096 [2024-11-27 06:16:39.603826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.096 #33 NEW cov: 11839 ft: 14707 corp: 19/448b lim: 45 exec/s: 33 rss: 69Mb L: 29/42 MS: 1 CrossOver- 00:07:10.356 [2024-11-27 06:16:39.653319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.653349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.356 [2024-11-27 06:16:39.653478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.653497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.356 #34 NEW cov: 11839 ft: 14732 corp: 20/467b lim: 45 exec/s: 34 rss: 69Mb L: 19/42 MS: 1 EraseBytes- 00:07:10.356 [2024-11-27 06:16:39.693979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:01000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.694008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.356 [2024-11-27 06:16:39.694130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.694149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.356 [2024-11-27 06:16:39.694271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:fc007f14 cdw11:20530007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.694291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.356 [2024-11-27 06:16:39.694388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:2d800092 cdw11:24a90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.694406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.356 #35 NEW cov: 11839 ft: 14737 corp: 21/504b lim: 45 exec/s: 35 rss: 69Mb L: 37/42 MS: 1 CMP- DE: "\000\222-\200$\251\367\212"- 00:07:10.356 [2024-11-27 06:16:39.743819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:01000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.743848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.356 [2024-11-27 06:16:39.743969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:53ff0020 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.743985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.356 [2024-11-27 06:16:39.744102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0101ffff cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.744120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.356 [2024-11-27 06:16:39.744232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.744250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.356 #36 NEW cov: 11839 ft: 14744 corp: 22/546b lim: 45 exec/s: 36 rss: 69Mb L: 42/42 MS: 1 
PersAutoDict- DE: "\001\000\177\024\374\000 S"- 00:07:10.356 [2024-11-27 06:16:39.794381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.794411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.356 [2024-11-27 06:16:39.794539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.794557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.356 [2024-11-27 06:16:39.794669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0101ffff cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.794685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.356 [2024-11-27 06:16:39.794782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.794798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.356 #37 NEW cov: 11839 ft: 14778 corp: 23/590b lim: 45 exec/s: 37 rss: 69Mb L: 44/44 MS: 1 CopyPart- 00:07:10.356 [2024-11-27 06:16:39.833967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.833994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.356 [2024-11-27 06:16:39.834122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:fffffffe cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.356 [2024-11-27 06:16:39.834142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.356 #38 NEW cov: 11839 ft: 14802 corp: 24/615b lim: 45 exec/s: 38 rss: 69Mb L: 25/44 MS: 1 ChangeBit- 00:07:10.616 [2024-11-27 06:16:39.894737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fffdffff cdw11:01000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.616 [2024-11-27 06:16:39.894769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.616 [2024-11-27 06:16:39.894885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:53ff0020 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.616 [2024-11-27 06:16:39.894903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.616 [2024-11-27 06:16:39.895027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0101ffff cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.616 [2024-11-27 06:16:39.895044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.616 [2024-11-27 06:16:39.895168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.616 [2024-11-27 06:16:39.895186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.616 #39 NEW cov: 11839 ft: 14859 corp: 25/657b lim: 45 exec/s: 39 rss: 69Mb L: 42/44 MS: 1 ChangeBit- 00:07:10.616 [2024-11-27 06:16:39.944115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.616 [2024-11-27 06:16:39.944144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.616 [2024-11-27 06:16:39.944265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.616 [2024-11-27 06:16:39.944282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.616 #40 NEW cov: 11839 ft: 14885 corp: 26/676b lim: 45 exec/s: 40 rss: 69Mb L: 19/44 MS: 1 CopyPart- 00:07:10.616 [2024-11-27 06:16:39.994286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.616 [2024-11-27 06:16:39.994315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.616 [2024-11-27 06:16:39.994442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:fffffffe cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.616 [2024-11-27 06:16:39.994459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.616 [2024-11-27 06:16:39.994582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:24a92d80 cdw11:f78a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.616 [2024-11-27 06:16:39.994603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.617 #41 NEW cov: 11839 ft: 14907 corp: 27/709b lim: 45 exec/s: 41 rss: 69Mb L: 33/44 MS: 1 PersAutoDict- DE: "\000\222-\200$\251\367\212"- 00:07:10.617 [2024-11-27 06:16:40.044778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff24ffff cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.617 [2024-11-27 06:16:40.044808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.617 [2024-11-27 06:16:40.044933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.617 [2024-11-27 06:16:40.044953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.617 [2024-11-27 06:16:40.045073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:14fcff7f cdw11:00200002 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.617 [2024-11-27 06:16:40.045090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.617 [2024-11-27 06:16:40.045199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:922dff00 cdw11:80240005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.617 [2024-11-27 06:16:40.045218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.617 #42 NEW cov: 11839 ft: 14946 corp: 28/747b lim: 45 exec/s: 42 rss: 70Mb L: 38/44 MS: 1 InsertByte- 00:07:10.617 [2024-11-27 06:16:40.095022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.617 [2024-11-27 06:16:40.095052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.617 [2024-11-27 06:16:40.095171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.617 [2024-11-27 06:16:40.095187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.617 [2024-11-27 06:16:40.095307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:002014fc cdw11:53ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.617 [2024-11-27 06:16:40.095324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.617 #43 NEW cov: 11839 ft: 14956 corp: 29/775b lim: 45 exec/s: 43 rss: 70Mb L: 28/44 MS: 1 CopyPart- 00:07:10.617 [2024-11-27 06:16:40.135389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.617 [2024-11-27 06:16:40.135417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.617 [2024-11-27 06:16:40.135543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.617 [2024-11-27 06:16:40.135561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.617 [2024-11-27 06:16:40.135678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0101ffff cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.617 [2024-11-27 06:16:40.135694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.617 [2024-11-27 06:16:40.135804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.617 [2024-11-27 06:16:40.135822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.876 #44 NEW cov: 11839 ft: 14961 corp: 30/817b lim: 45 exec/s: 44 rss: 70Mb L: 42/44 MS: 1 ShuffleBytes- 00:07:10.877 [2024-11-27 06:16:40.175296] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fff8ffff cdw11:ff000004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.175325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.877 [2024-11-27 06:16:40.175451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:f78a24a9 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.175470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.877 [2024-11-27 06:16:40.175595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff01ffff cdw11:007f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.175617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.877 #45 NEW cov: 11839 ft: 14977 corp: 31/848b lim: 45 exec/s: 45 rss: 70Mb L: 31/44 MS: 1 PersAutoDict- DE: "\000\222-\200$\251\367\212"- 00:07:10.877 [2024-11-27 06:16:40.225965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fffdffff cdw11:01000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.225995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.877 [2024-11-27 06:16:40.226111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:53ff0020 cdw11:ff400007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.226129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.877 [2024-11-27 06:16:40.226247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.226264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.877 [2024-11-27 06:16:40.226384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.226400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.877 [2024-11-27 06:16:40.226511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:7f140100 cdw11:fc000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.226527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:10.877 #46 NEW cov: 11839 ft: 15043 corp: 32/893b lim: 45 exec/s: 46 rss: 70Mb L: 45/45 MS: 1 CrossOver- 00:07:10.877 [2024-11-27 06:16:40.275774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.275802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.877 
[2024-11-27 06:16:40.275917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.275934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.877 [2024-11-27 06:16:40.276055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.276073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.877 [2024-11-27 06:16:40.276217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.276234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.877 #47 NEW cov: 11839 ft: 15046 corp: 33/937b lim: 45 exec/s: 47 rss: 70Mb L: 44/45 MS: 1 CopyPart- 00:07:10.877 [2024-11-27 06:16:40.325407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.325436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.877 [2024-11-27 06:16:40.325551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffff53 cdw11:20ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.325568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.877 #48 NEW cov: 11839 ft: 15069 corp: 34/956b lim: 45 exec/s: 48 rss: 70Mb L: 19/45 MS: 1 ShuffleBytes- 00:07:10.877 [2024-11-27 06:16:40.364891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.364919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.877 #49 NEW cov: 11839 ft: 15085 corp: 35/973b lim: 45 exec/s: 49 rss: 70Mb L: 17/45 MS: 1 EraseBytes- 00:07:10.877 [2024-11-27 06:16:40.405176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.405205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.877 [2024-11-27 06:16:40.405332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:14fc007f cdw11:00200002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.877 [2024-11-27 06:16:40.405349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.137 #50 NEW cov: 11839 ft: 15095 corp: 36/994b lim: 45 exec/s: 50 rss: 70Mb L: 21/45 MS: 1 CMP- DE: "\377\377\377\377"- 00:07:11.137 [2024-11-27 06:16:40.455733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 
cdw10:ffffffff cdw11:fff70007 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:11.137 [2024-11-27 06:16:40.455760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.137 [2024-11-27 06:16:40.455878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:14fc007f cdw11:00200002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:11.137 [2024-11-27 06:16:40.455894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:11.137 #51 NEW cov: 11839 ft: 15102 corp: 37/1015b lim: 45 exec/s: 51 rss: 70Mb L: 21/45 MS: 1 ChangeBit-
00:07:11.137 [2024-11-27 06:16:40.505602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:11.137 [2024-11-27 06:16:40.505632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.137 [2024-11-27 06:16:40.505752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:11.137 [2024-11-27 06:16:40.505768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:11.137 #52 NEW cov: 11839 ft: 15170 corp: 38/1041b lim: 45 exec/s: 52 rss: 70Mb L: 26/45 MS: 1 InsertByte-
00:07:11.137 [2024-11-27 06:16:40.546211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:11.137 [2024-11-27 06:16:40.546241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:11.137 [2024-11-27 06:16:40.546354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:11.137 [2024-11-27 06:16:40.546376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:11.137 [2024-11-27 06:16:40.546502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:11.137 [2024-11-27 06:16:40.546521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:11.137 #53 NEW cov: 11839 ft: 15175 corp: 39/1074b lim: 45 exec/s: 26 rss: 70Mb L: 33/45 MS: 1 PersAutoDict- DE: "\377\377\377\377"-
00:07:11.137 #53 DONE cov: 11839 ft: 15175 corp: 39/1074b lim: 45 exec/s: 26 rss: 70Mb
00:07:11.137 ###### Recommended dictionary. ######
00:07:11.137 "\001\000\177\024\374\000 S" # Uses: 6
00:07:11.137 "\377\377\001\000" # Uses: 0
00:07:11.137 "\000\222-\200$\251\367\212" # Uses: 2
00:07:11.137 "\377\377\377\377" # Uses: 1
00:07:11.137 ###### End of recommended dictionary. ######
00:07:11.137 Done 53 runs in 2 second(s)
00:07:11.397 06:16:40 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf
00:07:11.397 06:16:40 -- ../common.sh@72 -- # (( i++ ))
00:07:11.397 06:16:40 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:11.397 06:16:40 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1
00:07:11.397 06:16:40 -- nvmf/run.sh@23 -- # local fuzzer_type=6
00:07:11.397 06:16:40 -- nvmf/run.sh@24 -- # local timen=1
00:07:11.397 06:16:40 -- nvmf/run.sh@25 -- # local core=0x1
00:07:11.397 06:16:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:07:11.397 06:16:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf
00:07:11.397 06:16:40 -- nvmf/run.sh@29 -- # printf %02d 6
00:07:11.397 06:16:40 -- nvmf/run.sh@29 -- # port=4406
00:07:11.397 06:16:40 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:07:11.397 06:16:40 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406'
00:07:11.397 06:16:40 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:11.397 06:16:40 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock
00:07:11.397 [2024-11-27 06:16:40.720061] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:11.397 [2024-11-27 06:16:40.720112] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid31618 ]
00:07:11.397 EAL: No free 2048 kB hugepages reported on node 1
00:07:11.657 [2024-11-27 06:16:40.893412] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:11.657 [2024-11-27 06:16:40.958460] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:11.657 [2024-11-27 06:16:40.958582] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:11.657 [2024-11-27 06:16:41.016522] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:11.657 [2024-11-27 06:16:41.032870] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 ***
00:07:11.657 INFO: Running with entropic power schedule (0xFF, 100).
00:07:11.657 INFO: Seed: 3912240430
00:07:11.657 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:11.657 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:11.657 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:07:11.657 INFO: A corpus is not provided, starting from an empty corpus
00:07:11.657 #2 INITED exec/s: 0 rss: 60Mb
00:07:11.657 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
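The ../common.sh@72-73 entries above show the loop that drives successive runs: the previous run's JSON config is removed, i is incremented and compared against fuzz_num, and start_llvm_fuzz is called again, here advancing to fuzzer 6 on port 4406 with a fresh seed and a new empty corpus. A hedged reconstruction of that dispatch loop follows; fuzz_num's value and the argument meanings are assumptions based on the surrounding log, not a verbatim copy of common.sh.

# Hypothetical reconstruction of the loop behind the ../common.sh@72-73 trace
# entries; the per-run config cleanup (nvmf/run.sh@46) happens at the tail of
# each start_llvm_fuzz call before the counter advances.
fuzz_num=${fuzz_num:-25}            # assumed count of nvmf fuzzer targets
for (( i = 0; i < fuzz_num; i++ )); do
    start_llvm_fuzz "$i" 1 0x1      # fuzzer type, time budget, core mask
done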
00:07:11.657 This may also happen if the target rejected all inputs we tried so far 00:07:11.657 [2024-11-27 06:16:41.082036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000027f cdw11:00000000 00:07:11.657 [2024-11-27 06:16:41.082063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.657 [2024-11-27 06:16:41.082113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:11.657 [2024-11-27 06:16:41.082126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.917 NEW_FUNC[1/669]: 0x445308 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:11.917 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:11.917 #4 NEW cov: 11529 ft: 11530 corp: 2/5b lim: 10 exec/s: 0 rss: 68Mb L: 4/4 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:11.917 [2024-11-27 06:16:41.382772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a25 cdw11:00000000 00:07:11.917 [2024-11-27 06:16:41.382803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.917 #5 NEW cov: 11642 ft: 12194 corp: 3/7b lim: 10 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 InsertByte- 00:07:11.917 [2024-11-27 06:16:41.422823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b70a cdw11:00000000 00:07:11.917 [2024-11-27 06:16:41.422849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.917 #9 NEW cov: 11648 ft: 12491 corp: 4/10b lim: 10 exec/s: 0 rss: 68Mb L: 3/4 MS: 4 ShuffleBytes-ChangeBit-ChangeByte-CrossOver- 00:07:12.177 [2024-11-27 06:16:41.452980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7f cdw11:00000000 00:07:12.177 [2024-11-27 06:16:41.453006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.177 [2024-11-27 06:16:41.453077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:12.177 [2024-11-27 06:16:41.453090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.177 #10 NEW cov: 11733 ft: 12783 corp: 5/14b lim: 10 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 ChangeBit- 00:07:12.177 [2024-11-27 06:16:41.493161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7f cdw11:00000000 00:07:12.177 [2024-11-27 06:16:41.493186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.177 [2024-11-27 06:16:41.493257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f5f cdw11:00000000 00:07:12.177 [2024-11-27 06:16:41.493270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:07:12.177 #11 NEW cov: 11733 ft: 12816 corp: 6/18b lim: 10 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 ChangeBit- 00:07:12.177 [2024-11-27 06:16:41.533106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000abc cdw11:00000000 00:07:12.177 [2024-11-27 06:16:41.533132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.177 #13 NEW cov: 11733 ft: 12871 corp: 7/20b lim: 10 exec/s: 0 rss: 68Mb L: 2/4 MS: 2 ShuffleBytes-InsertByte- 00:07:12.177 [2024-11-27 06:16:41.563356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000427f cdw11:00000000 00:07:12.177 [2024-11-27 06:16:41.563382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.177 [2024-11-27 06:16:41.563443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:12.177 [2024-11-27 06:16:41.563472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.177 #14 NEW cov: 11733 ft: 12917 corp: 8/24b lim: 10 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 ChangeBit- 00:07:12.177 [2024-11-27 06:16:41.603488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004225 cdw11:00000000 00:07:12.177 [2024-11-27 06:16:41.603512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.177 [2024-11-27 06:16:41.603568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:12.177 [2024-11-27 06:16:41.603581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.177 #15 NEW cov: 11733 ft: 13077 corp: 9/29b lim: 10 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 CrossOver- 00:07:12.177 [2024-11-27 06:16:41.643503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e25 cdw11:00000000 00:07:12.177 [2024-11-27 06:16:41.643529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.177 #16 NEW cov: 11733 ft: 13114 corp: 10/31b lim: 10 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ChangeByte- 00:07:12.178 [2024-11-27 06:16:41.683775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004225 cdw11:00000000 00:07:12.178 [2024-11-27 06:16:41.683801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.178 [2024-11-27 06:16:41.683857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:12.178 [2024-11-27 06:16:41.683871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.178 #17 NEW cov: 11733 ft: 13135 corp: 11/36b lim: 10 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 ShuffleBytes- 00:07:12.437 [2024-11-27 06:16:41.723693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e25 cdw11:00000000 00:07:12.437 [2024-11-27 06:16:41.723719] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.437 #18 NEW cov: 11733 ft: 13178 corp: 12/39b lim: 10 exec/s: 0 rss: 69Mb L: 3/5 MS: 1 InsertByte- 00:07:12.437 [2024-11-27 06:16:41.764061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000427f cdw11:00000000 00:07:12.437 [2024-11-27 06:16:41.764086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.437 [2024-11-27 06:16:41.764142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:12.437 [2024-11-27 06:16:41.764156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.437 [2024-11-27 06:16:41.764210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000427f cdw11:00000000 00:07:12.437 [2024-11-27 06:16:41.764223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.437 #19 NEW cov: 11733 ft: 13349 corp: 13/46b lim: 10 exec/s: 0 rss: 69Mb L: 7/7 MS: 1 CopyPart- 00:07:12.437 [2024-11-27 06:16:41.803937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002b98 cdw11:00000000 00:07:12.437 [2024-11-27 06:16:41.803962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.437 #24 NEW cov: 11733 ft: 13371 corp: 14/48b lim: 10 exec/s: 0 rss: 69Mb L: 2/7 MS: 5 EraseBytes-ChangeBit-ShuffleBytes-ChangeBit-InsertByte- 00:07:12.437 [2024-11-27 06:16:41.844176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e25 cdw11:00000000 00:07:12.437 [2024-11-27 06:16:41.844201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.437 [2024-11-27 06:16:41.844271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ace4 cdw11:00000000 00:07:12.437 [2024-11-27 06:16:41.844285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.437 #25 NEW cov: 11733 ft: 13390 corp: 15/52b lim: 10 exec/s: 0 rss: 69Mb L: 4/7 MS: 1 InsertByte- 00:07:12.437 [2024-11-27 06:16:41.884304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a7f cdw11:00000000 00:07:12.437 [2024-11-27 06:16:41.884330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.437 [2024-11-27 06:16:41.884384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:12.437 [2024-11-27 06:16:41.884398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.437 #26 NEW cov: 11733 ft: 13411 corp: 16/56b lim: 10 exec/s: 0 rss: 69Mb L: 4/7 MS: 1 ChangeBit- 00:07:12.437 [2024-11-27 06:16:41.924388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000527f cdw11:00000000 00:07:12.437 
[2024-11-27 06:16:41.924413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.438 [2024-11-27 06:16:41.924468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:12.438 [2024-11-27 06:16:41.924482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.438 #27 NEW cov: 11733 ft: 13536 corp: 17/60b lim: 10 exec/s: 0 rss: 69Mb L: 4/7 MS: 1 ChangeBit- 00:07:12.438 [2024-11-27 06:16:41.964548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7f cdw11:00000000 00:07:12.438 [2024-11-27 06:16:41.964573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.438 [2024-11-27 06:16:41.964632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a5f cdw11:00000000 00:07:12.438 [2024-11-27 06:16:41.964646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.697 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:12.697 #28 NEW cov: 11756 ft: 13589 corp: 18/64b lim: 10 exec/s: 0 rss: 69Mb L: 4/7 MS: 1 ChangeByte- 00:07:12.697 [2024-11-27 06:16:42.004524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b70a cdw11:00000000 00:07:12.697 [2024-11-27 06:16:42.004549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.697 #29 NEW cov: 11756 ft: 13608 corp: 19/67b lim: 10 exec/s: 0 rss: 69Mb L: 3/7 MS: 1 ShuffleBytes- 00:07:12.697 [2024-11-27 06:16:42.044810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000027f cdw11:00000000 00:07:12.697 [2024-11-27 06:16:42.044834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.697 [2024-11-27 06:16:42.044890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f6f cdw11:00000000 00:07:12.697 [2024-11-27 06:16:42.044904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.697 #30 NEW cov: 11756 ft: 13617 corp: 20/71b lim: 10 exec/s: 0 rss: 69Mb L: 4/7 MS: 1 ChangeBit- 00:07:12.697 [2024-11-27 06:16:42.084900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:12.697 [2024-11-27 06:16:42.084924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.697 [2024-11-27 06:16:42.084979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f5f cdw11:00000000 00:07:12.697 [2024-11-27 06:16:42.084992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.697 #31 NEW cov: 11756 ft: 13664 corp: 21/75b lim: 10 exec/s: 31 rss: 69Mb L: 4/7 MS: 1 ShuffleBytes- 00:07:12.697 [2024-11-27 06:16:42.124866] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000027f cdw11:00000000 00:07:12.697 [2024-11-27 06:16:42.124891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.697 #32 NEW cov: 11756 ft: 13676 corp: 22/77b lim: 10 exec/s: 32 rss: 69Mb L: 2/7 MS: 1 CrossOver- 00:07:12.697 [2024-11-27 06:16:42.164997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002b98 cdw11:00000000 00:07:12.697 [2024-11-27 06:16:42.165022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.697 #33 NEW cov: 11756 ft: 13720 corp: 23/79b lim: 10 exec/s: 33 rss: 69Mb L: 2/7 MS: 1 ShuffleBytes- 00:07:12.697 [2024-11-27 06:16:42.205229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a0e cdw11:00000000 00:07:12.697 [2024-11-27 06:16:42.205254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.697 [2024-11-27 06:16:42.205334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f5f cdw11:00000000 00:07:12.697 [2024-11-27 06:16:42.205348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.697 #34 NEW cov: 11756 ft: 13747 corp: 24/83b lim: 10 exec/s: 34 rss: 69Mb L: 4/7 MS: 1 ChangeBit- 00:07:12.957 [2024-11-27 06:16:42.245218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000027e cdw11:00000000 00:07:12.957 [2024-11-27 06:16:42.245243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.957 #35 NEW cov: 11756 ft: 13754 corp: 25/85b lim: 10 exec/s: 35 rss: 70Mb L: 2/7 MS: 1 ChangeBit- 00:07:12.957 [2024-11-27 06:16:42.285593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007f7a cdw11:00000000 00:07:12.957 [2024-11-27 06:16:42.285621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.957 [2024-11-27 06:16:42.285676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005f0a cdw11:00000000 00:07:12.957 [2024-11-27 06:16:42.285689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.957 [2024-11-27 06:16:42.285743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007f7a cdw11:00000000 00:07:12.957 [2024-11-27 06:16:42.285757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.957 #36 NEW cov: 11756 ft: 13814 corp: 26/92b lim: 10 exec/s: 36 rss: 70Mb L: 7/7 MS: 1 CopyPart- 00:07:12.957 [2024-11-27 06:16:42.325729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a42 cdw11:00000000 00:07:12.957 [2024-11-27 06:16:42.325753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.957 [2024-11-27 06:16:42.325810] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f25 cdw11:00000000 00:07:12.957 [2024-11-27 06:16:42.325826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.957 [2024-11-27 06:16:42.325878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:12.957 [2024-11-27 06:16:42.325891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.957 #37 NEW cov: 11756 ft: 13822 corp: 27/99b lim: 10 exec/s: 37 rss: 70Mb L: 7/7 MS: 1 CrossOver- 00:07:12.957 [2024-11-27 06:16:42.365686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000427f cdw11:00000000 00:07:12.957 [2024-11-27 06:16:42.365711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.957 [2024-11-27 06:16:42.365783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:12.957 [2024-11-27 06:16:42.365797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.957 #38 NEW cov: 11756 ft: 13837 corp: 28/103b lim: 10 exec/s: 38 rss: 70Mb L: 4/7 MS: 1 EraseBytes- 00:07:12.957 [2024-11-27 06:16:42.406205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000427f cdw11:00000000 00:07:12.957 [2024-11-27 06:16:42.406230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.957 [2024-11-27 06:16:42.406285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:12.957 [2024-11-27 06:16:42.406298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.957 [2024-11-27 06:16:42.406352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000427f cdw11:00000000 00:07:12.957 [2024-11-27 06:16:42.406365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.957 [2024-11-27 06:16:42.406418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007f42 cdw11:00000000 00:07:12.957 [2024-11-27 06:16:42.406431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.957 [2024-11-27 06:16:42.406484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:12.958 [2024-11-27 06:16:42.406497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:12.958 #39 NEW cov: 11756 ft: 14078 corp: 29/113b lim: 10 exec/s: 39 rss: 70Mb L: 10/10 MS: 1 CopyPart- 00:07:12.958 [2024-11-27 06:16:42.445942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000027f cdw11:00000000 00:07:12.958 [2024-11-27 06:16:42.445966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.958 [2024-11-27 06:16:42.446019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f6f cdw11:00000000 00:07:12.958 [2024-11-27 06:16:42.446032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.958 #40 NEW cov: 11756 ft: 14092 corp: 30/118b lim: 10 exec/s: 40 rss: 70Mb L: 5/10 MS: 1 InsertByte- 00:07:12.958 [2024-11-27 06:16:42.486161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000bc7f cdw11:00000000 00:07:12.958 [2024-11-27 06:16:42.486186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.958 [2024-11-27 06:16:42.486258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:12.958 [2024-11-27 06:16:42.486275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.958 [2024-11-27 06:16:42.486330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000427f cdw11:00000000 00:07:12.958 [2024-11-27 06:16:42.486344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.217 #41 NEW cov: 11756 ft: 14105 corp: 31/125b lim: 10 exec/s: 41 rss: 70Mb L: 7/10 MS: 1 ChangeBinInt- 00:07:13.217 [2024-11-27 06:16:42.525991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000427f cdw11:00000000 00:07:13.217 [2024-11-27 06:16:42.526015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.217 #42 NEW cov: 11756 ft: 14165 corp: 32/128b lim: 10 exec/s: 42 rss: 70Mb L: 3/10 MS: 1 EraseBytes- 00:07:13.217 [2024-11-27 06:16:42.566307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003db7 cdw11:00000000 00:07:13.217 [2024-11-27 06:16:42.566331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.217 [2024-11-27 06:16:42.566401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a25 cdw11:00000000 00:07:13.217 [2024-11-27 06:16:42.566415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.217 #44 NEW cov: 11756 ft: 14175 corp: 33/132b lim: 10 exec/s: 44 rss: 70Mb L: 4/10 MS: 2 ChangeByte-CrossOver- 00:07:13.217 [2024-11-27 06:16:42.596365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a3a3 cdw11:00000000 00:07:13.217 [2024-11-27 06:16:42.596390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.217 [2024-11-27 06:16:42.596446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000a32e cdw11:00000000 00:07:13.217 [2024-11-27 06:16:42.596459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.217 #45 NEW cov: 11756 ft: 14179 corp: 
34/137b lim: 10 exec/s: 45 rss: 70Mb L: 5/10 MS: 1 InsertRepeatedBytes- 00:07:13.218 [2024-11-27 06:16:42.636726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7f cdw11:00000000 00:07:13.218 [2024-11-27 06:16:42.636750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.218 [2024-11-27 06:16:42.636807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a7f cdw11:00000000 00:07:13.218 [2024-11-27 06:16:42.636820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.218 [2024-11-27 06:16:42.636874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a5f cdw11:00000000 00:07:13.218 [2024-11-27 06:16:42.636887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.218 [2024-11-27 06:16:42.636942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a5f cdw11:00000000 00:07:13.218 [2024-11-27 06:16:42.636954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.218 #46 NEW cov: 11756 ft: 14203 corp: 35/145b lim: 10 exec/s: 46 rss: 70Mb L: 8/10 MS: 1 CopyPart- 00:07:13.218 [2024-11-27 06:16:42.676440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002698 cdw11:00000000 00:07:13.218 [2024-11-27 06:16:42.676464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.218 #47 NEW cov: 11756 ft: 14224 corp: 36/147b lim: 10 exec/s: 47 rss: 70Mb L: 2/10 MS: 1 ChangeBinInt- 00:07:13.218 [2024-11-27 06:16:42.716937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.218 [2024-11-27 06:16:42.716962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.218 [2024-11-27 06:16:42.717017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.218 [2024-11-27 06:16:42.717030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.218 [2024-11-27 06:16:42.717085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.218 [2024-11-27 06:16:42.717098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.218 [2024-11-27 06:16:42.717152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 00:07:13.218 [2024-11-27 06:16:42.717165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.218 #48 NEW cov: 11756 ft: 14309 corp: 37/156b lim: 10 exec/s: 48 rss: 70Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:13.477 [2024-11-27 06:16:42.756939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 
cdw10:00007a0a cdw11:00000000 00:07:13.477 [2024-11-27 06:16:42.756965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.477 [2024-11-27 06:16:42.757022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f5f cdw11:00000000 00:07:13.477 [2024-11-27 06:16:42.757036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.477 [2024-11-27 06:16:42.757093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007171 cdw11:00000000 00:07:13.477 [2024-11-27 06:16:42.757106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.477 #49 NEW cov: 11756 ft: 14322 corp: 38/163b lim: 10 exec/s: 49 rss: 70Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:07:13.477 [2024-11-27 06:16:42.797168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000427f cdw11:00000000 00:07:13.477 [2024-11-27 06:16:42.797194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.477 [2024-11-27 06:16:42.797250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f42 cdw11:00000000 00:07:13.477 [2024-11-27 06:16:42.797263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.477 [2024-11-27 06:16:42.797318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000257f cdw11:00000000 00:07:13.477 [2024-11-27 06:16:42.797331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.477 [2024-11-27 06:16:42.797384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:13.477 [2024-11-27 06:16:42.797397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.477 #50 NEW cov: 11756 ft: 14381 corp: 39/171b lim: 10 exec/s: 50 rss: 70Mb L: 8/10 MS: 1 CrossOver- 00:07:13.477 [2024-11-27 06:16:42.836994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000527f cdw11:00000000 00:07:13.477 [2024-11-27 06:16:42.837021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.477 [2024-11-27 06:16:42.837079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000767f cdw11:00000000 00:07:13.477 [2024-11-27 06:16:42.837093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.477 #51 NEW cov: 11756 ft: 14401 corp: 40/175b lim: 10 exec/s: 51 rss: 70Mb L: 4/10 MS: 1 ChangeBinInt- 00:07:13.477 [2024-11-27 06:16:42.877003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001a25 cdw11:00000000 00:07:13.478 [2024-11-27 06:16:42.877029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.478 #52 NEW 
cov: 11756 ft: 14419 corp: 41/177b lim: 10 exec/s: 52 rss: 70Mb L: 2/10 MS: 1 ChangeBit- 00:07:13.478 [2024-11-27 06:16:42.917242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004225 cdw11:00000000 00:07:13.478 [2024-11-27 06:16:42.917267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.478 [2024-11-27 06:16:42.917324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:13.478 [2024-11-27 06:16:42.917338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.478 #53 NEW cov: 11756 ft: 14429 corp: 42/182b lim: 10 exec/s: 53 rss: 70Mb L: 5/10 MS: 1 ChangeByte- 00:07:13.478 [2024-11-27 06:16:42.957358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7f cdw11:00000000 00:07:13.478 [2024-11-27 06:16:42.957383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.478 [2024-11-27 06:16:42.957441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a40 cdw11:00000000 00:07:13.478 [2024-11-27 06:16:42.957454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.478 #54 NEW cov: 11756 ft: 14475 corp: 43/187b lim: 10 exec/s: 54 rss: 70Mb L: 5/10 MS: 1 InsertByte- 00:07:13.478 [2024-11-27 06:16:42.987309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000027e cdw11:00000000 00:07:13.478 [2024-11-27 06:16:42.987334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.478 #55 NEW cov: 11756 ft: 14484 corp: 44/190b lim: 10 exec/s: 55 rss: 70Mb L: 3/10 MS: 1 InsertByte- 00:07:13.737 [2024-11-27 06:16:43.027627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffa3 cdw11:00000000 00:07:13.737 [2024-11-27 06:16:43.027653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.737 [2024-11-27 06:16:43.027725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000a32e cdw11:00000000 00:07:13.737 [2024-11-27 06:16:43.027740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.737 #56 NEW cov: 11756 ft: 14504 corp: 45/195b lim: 10 exec/s: 56 rss: 70Mb L: 5/10 MS: 1 ChangeByte- 00:07:13.737 [2024-11-27 06:16:43.067734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2b cdw11:00000000 00:07:13.737 [2024-11-27 06:16:43.067759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.737 [2024-11-27 06:16:43.067816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000987f cdw11:00000000 00:07:13.737 [2024-11-27 06:16:43.067829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.737 #57 NEW 
cov: 11756 ft: 14510 corp: 46/200b lim: 10 exec/s: 28 rss: 70Mb L: 5/10 MS: 1 CrossOver-
00:07:13.737 #57 DONE cov: 11756 ft: 14510 corp: 46/200b lim: 10 exec/s: 28 rss: 70Mb
00:07:13.737 Done 57 runs in 2 second(s)
00:07:13.737 06:16:43 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf
00:07:13.737 06:16:43 -- ../common.sh@72 -- # (( i++ ))
00:07:13.737 06:16:43 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:13.737 06:16:43 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1
00:07:13.737 06:16:43 -- nvmf/run.sh@23 -- # local fuzzer_type=7
00:07:13.737 06:16:43 -- nvmf/run.sh@24 -- # local timen=1
00:07:13.737 06:16:43 -- nvmf/run.sh@25 -- # local core=0x1
00:07:13.737 06:16:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7
00:07:13.737 06:16:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf
00:07:13.737 06:16:43 -- nvmf/run.sh@29 -- # printf %02d 7
00:07:13.737 06:16:43 -- nvmf/run.sh@29 -- # port=4407
00:07:13.737 06:16:43 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7
00:07:13.737 06:16:43 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407'
00:07:13.738 06:16:43 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:13.738 06:16:43 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock
00:07:13.738 [2024-11-27 06:16:43.242512] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:13.738 [2024-11-27 06:16:43.242584] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid32085 ]
00:07:13.997 EAL: No free 2048 kB hugepages reported on node 1
00:07:13.997 [2024-11-27 06:16:43.418584] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:13.997 [2024-11-27 06:16:43.482055] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:13.997 [2024-11-27 06:16:43.482198] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:14.257 [2024-11-27 06:16:43.540185] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:14.257 [2024-11-27 06:16:43.556579] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 ***
00:07:14.257 INFO: Running with entropic power schedule (0xFF, 100).
00:07:14.257 INFO: Seed: 2142286861
00:07:14.257 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:14.257 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:14.257 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7
00:07:14.257 INFO: A corpus is not provided, starting from an empty corpus
00:07:14.257 #2 INITED exec/s: 0 rss: 61Mb
00:07:14.257 WARNING: no interesting inputs were found so far.
Is the code instrumented for coverage? 00:07:14.257 This may also happen if the target rejected all inputs we tried so far 00:07:14.257 [2024-11-27 06:16:43.611781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:14.257 [2024-11-27 06:16:43.611811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.517 NEW_FUNC[1/669]: 0x445d08 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:14.517 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:14.517 #10 NEW cov: 11529 ft: 11515 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 3 ChangeByte-ShuffleBytes-CrossOver- 00:07:14.517 [2024-11-27 06:16:43.912627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:14.517 [2024-11-27 06:16:43.912662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.517 [2024-11-27 06:16:43.912715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:14.517 [2024-11-27 06:16:43.912728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.517 #11 NEW cov: 11642 ft: 12178 corp: 3/8b lim: 10 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:14.517 [2024-11-27 06:16:43.962709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:14.517 [2024-11-27 06:16:43.962736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.517 [2024-11-27 06:16:43.962790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:14.517 [2024-11-27 06:16:43.962804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.517 #12 NEW cov: 11648 ft: 12457 corp: 4/13b lim: 10 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 ShuffleBytes- 00:07:14.517 [2024-11-27 06:16:44.002821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:14.517 [2024-11-27 06:16:44.002847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.517 [2024-11-27 06:16:44.002901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000aff cdw11:00000000 00:07:14.517 [2024-11-27 06:16:44.002915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.517 #13 NEW cov: 11733 ft: 12829 corp: 5/18b lim: 10 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 CrossOver- 00:07:14.517 [2024-11-27 06:16:44.042877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:14.517 [2024-11-27 06:16:44.042903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.517 [2024-11-27 06:16:44.042972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:14.517 [2024-11-27 06:16:44.042986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.777 #14 NEW cov: 11733 ft: 12864 corp: 6/23b lim: 10 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 CrossOver- 00:07:14.777 [2024-11-27 06:16:44.083367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:14.777 [2024-11-27 06:16:44.083392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.777 [2024-11-27 06:16:44.083463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:14.777 [2024-11-27 06:16:44.083477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.777 [2024-11-27 06:16:44.083531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:14.777 [2024-11-27 06:16:44.083544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.777 [2024-11-27 06:16:44.083594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:14.777 [2024-11-27 06:16:44.083611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.777 [2024-11-27 06:16:44.083663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:14.777 [2024-11-27 06:16:44.083678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:14.777 #15 NEW cov: 11733 ft: 13202 corp: 7/33b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:14.777 [2024-11-27 06:16:44.122988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aae cdw11:00000000 00:07:14.777 [2024-11-27 06:16:44.123013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.777 #16 NEW cov: 11733 ft: 13318 corp: 8/35b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:07:14.777 [2024-11-27 06:16:44.163278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:14.777 [2024-11-27 06:16:44.163303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.777 [2024-11-27 06:16:44.163375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff1e cdw11:00000000 00:07:14.777 [2024-11-27 06:16:44.163388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.777 #17 NEW cov: 11733 ft: 13375 corp: 9/40b lim: 10 exec/s: 0 rss: 69Mb L: 5/10 MS: 1 CMP- DE: "\377\036"- 00:07:14.777 [2024-11-27 06:16:44.203392] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000eff cdw11:00000000 00:07:14.777 [2024-11-27 06:16:44.203417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.777 [2024-11-27 06:16:44.203486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:14.777 [2024-11-27 06:16:44.203499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.777 #23 NEW cov: 11733 ft: 13456 corp: 10/45b lim: 10 exec/s: 0 rss: 69Mb L: 5/10 MS: 1 ChangeBit- 00:07:14.777 [2024-11-27 06:16:44.243395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aac cdw11:00000000 00:07:14.777 [2024-11-27 06:16:44.243419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.777 #24 NEW cov: 11733 ft: 13509 corp: 11/47b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:07:14.778 [2024-11-27 06:16:44.283978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:14.778 [2024-11-27 06:16:44.284003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.778 [2024-11-27 06:16:44.284072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:14.778 [2024-11-27 06:16:44.284085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.778 [2024-11-27 06:16:44.284139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:14.778 [2024-11-27 06:16:44.284152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.778 [2024-11-27 06:16:44.284207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:14.778 [2024-11-27 06:16:44.284220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.778 [2024-11-27 06:16:44.284274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:14.778 [2024-11-27 06:16:44.284287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:14.778 #25 NEW cov: 11733 ft: 13623 corp: 12/57b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:15.037 [2024-11-27 06:16:44.323730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000eff cdw11:00000000 00:07:15.037 [2024-11-27 06:16:44.323756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.037 [2024-11-27 06:16:44.323825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.037 [2024-11-27 06:16:44.323839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.037 #26 NEW cov: 11733 ft: 13635 corp: 13/62b lim: 10 exec/s: 0 rss: 69Mb L: 5/10 MS: 1 CopyPart- 00:07:15.037 [2024-11-27 06:16:44.363935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:15.037 [2024-11-27 06:16:44.363960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.037 [2024-11-27 06:16:44.364014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff4b cdw11:00000000 00:07:15.037 [2024-11-27 06:16:44.364028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.037 [2024-11-27 06:16:44.364080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001eff cdw11:00000000 00:07:15.037 [2024-11-27 06:16:44.364094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.037 #27 NEW cov: 11733 ft: 13792 corp: 14/68b lim: 10 exec/s: 0 rss: 69Mb L: 6/10 MS: 1 InsertByte- 00:07:15.037 [2024-11-27 06:16:44.403822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001aac cdw11:00000000 00:07:15.037 [2024-11-27 06:16:44.403847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.037 #28 NEW cov: 11733 ft: 13805 corp: 15/70b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:07:15.038 [2024-11-27 06:16:44.444077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000ac4 cdw11:00000000 00:07:15.038 [2024-11-27 06:16:44.444103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.038 [2024-11-27 06:16:44.444156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.038 [2024-11-27 06:16:44.444170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.038 #29 NEW cov: 11733 ft: 13825 corp: 16/75b lim: 10 exec/s: 0 rss: 69Mb L: 5/10 MS: 1 ChangeByte- 00:07:15.038 [2024-11-27 06:16:44.484158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aae cdw11:00000000 00:07:15.038 [2024-11-27 06:16:44.484183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.038 [2024-11-27 06:16:44.484238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000aae cdw11:00000000 00:07:15.038 [2024-11-27 06:16:44.484251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.038 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:15.038 #30 NEW cov: 11756 ft: 13869 corp: 17/79b lim: 10 exec/s: 0 rss: 69Mb L: 4/10 MS: 1 CopyPart- 00:07:15.038 [2024-11-27 06:16:44.524155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff1e cdw11:00000000 
00:07:15.038 [2024-11-27 06:16:44.524181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.038 #31 NEW cov: 11756 ft: 13882 corp: 18/81b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 PersAutoDict- DE: "\377\036"- 00:07:15.038 [2024-11-27 06:16:44.554480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:15.038 [2024-11-27 06:16:44.554505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.038 [2024-11-27 06:16:44.554573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.038 [2024-11-27 06:16:44.554587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.038 [2024-11-27 06:16:44.554649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff0e cdw11:00000000 00:07:15.038 [2024-11-27 06:16:44.554663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.298 #32 NEW cov: 11756 ft: 13954 corp: 19/88b lim: 10 exec/s: 0 rss: 69Mb L: 7/10 MS: 1 CrossOver- 00:07:15.298 [2024-11-27 06:16:44.594372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.594397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.298 #33 NEW cov: 11756 ft: 13965 corp: 20/91b lim: 10 exec/s: 33 rss: 69Mb L: 3/10 MS: 1 PersAutoDict- DE: "\377\036"- 00:07:15.298 [2024-11-27 06:16:44.634608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000af7 cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.634633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.298 [2024-11-27 06:16:44.634686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.634699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.298 #34 NEW cov: 11756 ft: 14093 corp: 21/96b lim: 10 exec/s: 34 rss: 69Mb L: 5/10 MS: 1 ChangeBit- 00:07:15.298 [2024-11-27 06:16:44.674763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffae cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.674789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.298 [2024-11-27 06:16:44.674841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000aae cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.674855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.298 #35 NEW cov: 11756 ft: 14117 corp: 22/100b lim: 10 exec/s: 35 rss: 69Mb L: 4/10 MS: 1 ChangeByte- 00:07:15.298 [2024-11-27 06:16:44.715131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 
cdw10:00000aff cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.715157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.298 [2024-11-27 06:16:44.715212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004b1e cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.715226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.298 [2024-11-27 06:16:44.715279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.715308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.298 [2024-11-27 06:16:44.715361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00004b1e cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.715374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.298 #36 NEW cov: 11756 ft: 14143 corp: 23/109b lim: 10 exec/s: 36 rss: 70Mb L: 9/10 MS: 1 CopyPart- 00:07:15.298 [2024-11-27 06:16:44.754983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a3a cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.755023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.298 [2024-11-27 06:16:44.755079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.755093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.298 #37 NEW cov: 11756 ft: 14153 corp: 24/114b lim: 10 exec/s: 37 rss: 70Mb L: 5/10 MS: 1 ChangeByte- 00:07:15.298 [2024-11-27 06:16:44.795438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.795463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.298 [2024-11-27 06:16:44.795534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008f4b cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.795549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.298 [2024-11-27 06:16:44.795607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001eff cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.795621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.298 [2024-11-27 06:16:44.795676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff4b cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.795701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.298 [2024-11-27 06:16:44.795755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 
cdw10:00001eff cdw11:00000000 00:07:15.298 [2024-11-27 06:16:44.795768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.298 #38 NEW cov: 11756 ft: 14166 corp: 25/124b lim: 10 exec/s: 38 rss: 70Mb L: 10/10 MS: 1 InsertByte- 00:07:15.559 [2024-11-27 06:16:44.835268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffae cdw11:00000000 00:07:15.559 [2024-11-27 06:16:44.835295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.559 [2024-11-27 06:16:44.835351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a8e cdw11:00000000 00:07:15.559 [2024-11-27 06:16:44.835365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.559 #39 NEW cov: 11756 ft: 14189 corp: 26/128b lim: 10 exec/s: 39 rss: 70Mb L: 4/10 MS: 1 ChangeBit- 00:07:15.559 [2024-11-27 06:16:44.875468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:15.559 [2024-11-27 06:16:44.875493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.559 [2024-11-27 06:16:44.875563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff4b cdw11:00000000 00:07:15.559 [2024-11-27 06:16:44.875577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.559 [2024-11-27 06:16:44.875631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001efb cdw11:00000000 00:07:15.559 [2024-11-27 06:16:44.875645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.559 #40 NEW cov: 11756 ft: 14208 corp: 27/134b lim: 10 exec/s: 40 rss: 70Mb L: 6/10 MS: 1 ChangeBit- 00:07:15.559 [2024-11-27 06:16:44.915272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff1e cdw11:00000000 00:07:15.559 [2024-11-27 06:16:44.915298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.559 #41 NEW cov: 11756 ft: 14240 corp: 28/137b lim: 10 exec/s: 41 rss: 70Mb L: 3/10 MS: 1 CopyPart- 00:07:15.559 [2024-11-27 06:16:44.955604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a0a cdw11:00000000 00:07:15.559 [2024-11-27 06:16:44.955630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.559 [2024-11-27 06:16:44.955685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff1e cdw11:00000000 00:07:15.559 [2024-11-27 06:16:44.955698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.559 #42 NEW cov: 11756 ft: 14262 corp: 29/141b lim: 10 exec/s: 42 rss: 70Mb L: 4/10 MS: 1 InsertByte- 00:07:15.559 [2024-11-27 06:16:44.996042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 
cdw10:00000aff cdw11:00000000 00:07:15.559 [2024-11-27 06:16:44.996067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.559 [2024-11-27 06:16:44.996122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.559 [2024-11-27 06:16:44.996135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.559 [2024-11-27 06:16:44.996187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:15.559 [2024-11-27 06:16:44.996216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.559 [2024-11-27 06:16:44.996268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00003aff cdw11:00000000 00:07:15.559 [2024-11-27 06:16:44.996281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.559 [2024-11-27 06:16:44.996334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.559 [2024-11-27 06:16:44.996347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.559 #43 NEW cov: 11756 ft: 14266 corp: 30/151b lim: 10 exec/s: 43 rss: 70Mb L: 10/10 MS: 1 CrossOver- 00:07:15.559 [2024-11-27 06:16:45.035766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.559 [2024-11-27 06:16:45.035791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.559 [2024-11-27 06:16:45.035845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000eff cdw11:00000000 00:07:15.559 [2024-11-27 06:16:45.035858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.559 #44 NEW cov: 11756 ft: 14283 corp: 31/156b lim: 10 exec/s: 44 rss: 70Mb L: 5/10 MS: 1 ShuffleBytes- 00:07:15.559 [2024-11-27 06:16:45.076182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:15.559 [2024-11-27 06:16:45.076208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.559 [2024-11-27 06:16:45.076278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:15.559 [2024-11-27 06:16:45.076294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.559 [2024-11-27 06:16:45.076348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003aff cdw11:00000000 00:07:15.559 [2024-11-27 06:16:45.076361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.559 [2024-11-27 06:16:45.076415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 
cdw10:0000ffff cdw11:00000000 00:07:15.559 [2024-11-27 06:16:45.076429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.820 #45 NEW cov: 11756 ft: 14302 corp: 32/164b lim: 10 exec/s: 45 rss: 70Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:07:15.820 [2024-11-27 06:16:45.116419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.116444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.116499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.116511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.116566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.116579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.116636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00003aff cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.116649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.116700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.116713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.820 #46 NEW cov: 11756 ft: 14326 corp: 33/174b lim: 10 exec/s: 46 rss: 70Mb L: 10/10 MS: 1 CopyPart- 00:07:15.820 [2024-11-27 06:16:45.156409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.156434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.156504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.156518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.156572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001e4b cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.156586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.156644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001efb cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.156658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.820 #47 NEW cov: 11756 ft: 14336 corp: 34/182b lim: 10 exec/s: 47 rss: 70Mb L: 8/10 MS: 1 PersAutoDict- DE: "\377\036"- 
00:07:15.820 [2024-11-27 06:16:45.196511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a7f cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.196539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.196596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004b1e cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.196614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.196684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.196698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.196750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00004b1e cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.196763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.820 #48 NEW cov: 11756 ft: 14342 corp: 35/191b lim: 10 exec/s: 48 rss: 70Mb L: 9/10 MS: 1 ChangeBit- 00:07:15.820 [2024-11-27 06:16:45.236755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000aaaa cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.236780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.236834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000aaaa cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.236848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.236900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000aaaa cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.236914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.236966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00008a0a cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.236979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.820 [2024-11-27 06:16:45.237033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff1e cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.237046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.820 #49 NEW cov: 11756 ft: 14418 corp: 36/201b lim: 10 exec/s: 49 rss: 70Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:15.820 [2024-11-27 06:16:45.276542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:15.820 [2024-11-27 06:16:45.276566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:15.820 [2024-11-27 06:16:45.276639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 00:07:15.821 [2024-11-27 06:16:45.276653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.821 #50 NEW cov: 11756 ft: 14434 corp: 37/206b lim: 10 exec/s: 50 rss: 70Mb L: 5/10 MS: 1 ChangeBinInt- 00:07:15.821 [2024-11-27 06:16:45.306971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:15.821 [2024-11-27 06:16:45.306996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.821 [2024-11-27 06:16:45.307064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.821 [2024-11-27 06:16:45.307077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.821 [2024-11-27 06:16:45.307134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001e00 cdw11:00000000 00:07:15.821 [2024-11-27 06:16:45.307147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.821 [2024-11-27 06:16:45.307201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:15.821 [2024-11-27 06:16:45.307215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.821 [2024-11-27 06:16:45.307268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.821 [2024-11-27 06:16:45.307281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.821 #51 NEW cov: 11756 ft: 14438 corp: 38/216b lim: 10 exec/s: 51 rss: 70Mb L: 10/10 MS: 1 CMP- DE: "\036\000\000\000"- 00:07:15.821 [2024-11-27 06:16:45.346854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:15.821 [2024-11-27 06:16:45.346879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.821 [2024-11-27 06:16:45.346948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000eff cdw11:00000000 00:07:15.821 [2024-11-27 06:16:45.346961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.821 [2024-11-27 06:16:45.347015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:15.821 [2024-11-27 06:16:45.347029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.079 #52 NEW cov: 11756 ft: 14506 corp: 39/223b lim: 10 exec/s: 52 rss: 70Mb L: 7/10 MS: 1 CopyPart- 00:07:16.079 [2024-11-27 06:16:45.386978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000eff cdw11:00000000 00:07:16.079 [2024-11-27 
06:16:45.387003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.079 [2024-11-27 06:16:45.387076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:16.079 [2024-11-27 06:16:45.387089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.080 [2024-11-27 06:16:45.387141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a6ff cdw11:00000000 00:07:16.080 [2024-11-27 06:16:45.387154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.080 #53 NEW cov: 11756 ft: 14518 corp: 40/229b lim: 10 exec/s: 53 rss: 70Mb L: 6/10 MS: 1 InsertByte- 00:07:16.080 [2024-11-27 06:16:45.426821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:16.080 [2024-11-27 06:16:45.426846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.080 #54 NEW cov: 11756 ft: 14519 corp: 41/231b lim: 10 exec/s: 54 rss: 70Mb L: 2/10 MS: 1 EraseBytes- 00:07:16.080 [2024-11-27 06:16:45.456921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:16.080 [2024-11-27 06:16:45.456946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.080 #55 NEW cov: 11756 ft: 14573 corp: 42/234b lim: 10 exec/s: 55 rss: 70Mb L: 3/10 MS: 1 ChangeBit- 00:07:16.080 [2024-11-27 06:16:45.486981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00009494 cdw11:00000000 00:07:16.080 [2024-11-27 06:16:45.487009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.080 #57 NEW cov: 11756 ft: 14578 corp: 43/236b lim: 10 exec/s: 57 rss: 70Mb L: 2/10 MS: 2 ChangeByte-CopyPart- 00:07:16.080 [2024-11-27 06:16:45.517363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000eac cdw11:00000000 00:07:16.080 [2024-11-27 06:16:45.517388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.080 [2024-11-27 06:16:45.517442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000eff cdw11:00000000 00:07:16.080 [2024-11-27 06:16:45.517455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.080 [2024-11-27 06:16:45.517509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000aff cdw11:00000000 00:07:16.080 [2024-11-27 06:16:45.517522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.080 #58 NEW cov: 11756 ft: 14585 corp: 44/243b lim: 10 exec/s: 58 rss: 70Mb L: 7/10 MS: 1 ShuffleBytes- 00:07:16.080 [2024-11-27 06:16:45.557250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:16.080 
[2024-11-27 06:16:45.557275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:16.080 #59 NEW cov: 11756 ft: 14595 corp: 45/245b lim: 10 exec/s: 59 rss: 70Mb L: 2/10 MS: 1 CopyPart-
00:07:16.080 [2024-11-27 06:16:45.587310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aac cdw11:00000000
00:07:16.080 [2024-11-27 06:16:45.587335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:16.080 #60 NEW cov: 11756 ft: 14601 corp: 46/247b lim: 10 exec/s: 30 rss: 70Mb L: 2/10 MS: 1 ShuffleBytes-
00:07:16.080 #60 DONE cov: 11756 ft: 14601 corp: 46/247b lim: 10 exec/s: 30 rss: 70Mb
00:07:16.080 ###### Recommended dictionary. ######
00:07:16.080 "\377\036" # Uses: 3
00:07:16.080 "\036\000\000\000" # Uses: 0
00:07:16.080 ###### End of recommended dictionary. ######
00:07:16.080 Done 60 runs in 2 second(s)
00:07:16.338 06:16:45 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf
00:07:16.338 06:16:45 -- ../common.sh@72 -- # (( i++ ))
00:07:16.338 06:16:45 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:16.338 06:16:45 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1
00:07:16.338 06:16:45 -- nvmf/run.sh@23 -- # local fuzzer_type=8
00:07:16.338 06:16:45 -- nvmf/run.sh@24 -- # local timen=1
00:07:16.338 06:16:45 -- nvmf/run.sh@25 -- # local core=0x1
00:07:16.338 06:16:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:07:16.338 06:16:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf
00:07:16.338 06:16:45 -- nvmf/run.sh@29 -- # printf %02d 8
00:07:16.338 06:16:45 -- nvmf/run.sh@29 -- # port=4408
00:07:16.338 06:16:45 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:07:16.338 06:16:45 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408'
00:07:16.338 06:16:45 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:16.338 06:16:45 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock
00:07:16.338 [2024-11-27 06:16:45.765287] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
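The nvmf/run.sh@23-@36 trace above is where each pass of the common.sh@72-@73 loop launches one fuzzer instance: it picks a fuzzer type, derives a dedicated TCP port (the printf %02d / port=4408 pair shows the port is "44" plus the zero-padded fuzzer number), rewrites the JSON config to that port with sed, and starts llvm_nvme_fuzz. Below is a minimal annotated re-run sketch of that same invocation; the flag annotations are inferred from the run.sh variable names visible in the trace (fuzzer_type, timen, core, corpus_dir), not from documented semantics, so treat them as assumptions:

```bash
#!/usr/bin/env bash
# Hand re-run of fuzzer type 8 as launched in the trace above. Flag meanings
# marked "inferred" come from run.sh's variable names, not authoritative docs;
# check the fuzzer binary's --help before relying on them.
SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
TRID='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408'

args=(
  -m 0x1                            # SPDK core mask (core=0x1 in the trace)
  -s 512                            # hugepage memory size in MB
  -P "$SPDK/../output/llvm/"        # output dir for run artifacts (inferred)
  -F "$TRID"                        # transport ID of the in-process TCP target
  -c /tmp/fuzz_json_8.conf          # per-port JSON config produced by the sed step
  -t 1                              # seconds to fuzz (timen=1, inferred)
  -D "$SPDK/../corpus/llvm_nvmf_8"  # libFuzzer corpus directory for this fuzzer
  -Z 8                              # fuzzer_type selecting the command set (inferred)
  -r /var/tmp/spdk8.sock            # SPDK RPC socket path
)
"$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" "${args[@]}"
```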
00:07:16.338 [2024-11-27 06:16:45.765372] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid32441 ] 00:07:16.338 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.598 [2024-11-27 06:16:45.943980] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.598 [2024-11-27 06:16:46.007973] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:16.598 [2024-11-27 06:16:46.008104] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.598 [2024-11-27 06:16:46.066119] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:16.598 [2024-11-27 06:16:46.082472] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:16.598 INFO: Running with entropic power schedule (0xFF, 100). 00:07:16.598 INFO: Seed: 372306642 00:07:16.598 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:16.598 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:16.598 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:16.598 INFO: A corpus is not provided, starting from an empty corpus 00:07:16.859 [2024-11-27 06:16:46.148381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.148418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.859 #2 INITED cov: 11554 ft: 11558 corp: 1/1b exec/s: 0 rss: 66Mb 00:07:16.859 [2024-11-27 06:16:46.189460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.189489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.859 [2024-11-27 06:16:46.189579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.189602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.859 [2024-11-27 06:16:46.189720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.189735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.859 [2024-11-27 06:16:46.189847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.189863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.859 [2024-11-27 06:16:46.189979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.189995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:16.859 #3 NEW cov: 11670 ft: 13008 corp: 2/6b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:16.859 [2024-11-27 06:16:46.238783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.238813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.859 [2024-11-27 06:16:46.238938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.238955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.859 #4 NEW cov: 11676 ft: 13402 corp: 3/8b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 InsertByte- 00:07:16.859 [2024-11-27 06:16:46.278891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.278918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.859 [2024-11-27 06:16:46.279048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.279065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.859 #5 NEW cov: 11761 ft: 13706 corp: 4/10b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ChangeByte- 00:07:16.859 [2024-11-27 06:16:46.329338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.329367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.859 [2024-11-27 06:16:46.329490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.329506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.859 [2024-11-27 06:16:46.329632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.329648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.859 #6 NEW cov: 11761 ft: 13918 corp: 5/13b lim: 5 exec/s: 0 rss: 67Mb L: 3/5 MS: 1 InsertByte- 00:07:16.859 [2024-11-27 06:16:46.369789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.369818] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.859 [2024-11-27 06:16:46.369938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.369955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.859 [2024-11-27 06:16:46.370076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.370092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.859 [2024-11-27 06:16:46.370206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.859 [2024-11-27 06:16:46.370224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.119 #7 NEW cov: 11761 ft: 14008 corp: 6/17b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 CrossOver- 00:07:17.119 [2024-11-27 06:16:46.420063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.119 [2024-11-27 06:16:46.420090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.119 [2024-11-27 06:16:46.420206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.119 [2024-11-27 06:16:46.420227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.119 [2024-11-27 06:16:46.420348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.119 [2024-11-27 06:16:46.420363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.119 [2024-11-27 06:16:46.420480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.119 [2024-11-27 06:16:46.420495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.119 [2024-11-27 06:16:46.420609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.119 [2024-11-27 06:16:46.420625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:17.119 #8 NEW cov: 11761 ft: 14051 corp: 7/22b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertByte- 00:07:17.119 [2024-11-27 06:16:46.470252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
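Each pair of NOTICE lines in this run is one fuzz iteration: nvme_admin_qpair_print_command logs the admin command the fuzzer submitted (NAMESPACE ATTACHMENT is admin opcode 0x15; cdw10/cdw11 are the command dwords being mutated), and spdk_nvme_print_completion logs the target's reply, where (00/01) is NVMe status code type 0, status 0x01, i.e. Invalid Command Opcode, and sqhd/p/m/dnr are the submission-queue head, phase tag, more bit, and do-not-retry bit. A quick way to sanity-check that every submitted command got a completion, assuming this output has been saved to a file (console.log here is a hypothetical stand-in, not part of the test scripts):

```bash
# The two counts should match if no command went unanswered.
grep -c 'nvme_admin_qpair_print_command' console.log
grep -c 'spdk_nvme_print_completion' console.log
```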
00:07:17.119 [2024-11-27 06:16:46.470279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.119 [2024-11-27 06:16:46.470399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.119 [2024-11-27 06:16:46.470416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.119 [2024-11-27 06:16:46.470529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.119 [2024-11-27 06:16:46.470545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.120 [2024-11-27 06:16:46.470666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.120 [2024-11-27 06:16:46.470682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.120 [2024-11-27 06:16:46.470794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.120 [2024-11-27 06:16:46.470809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:17.120 #9 NEW cov: 11761 ft: 14078 corp: 8/27b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeByte- 00:07:17.120 [2024-11-27 06:16:46.519622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.120 [2024-11-27 06:16:46.519650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.120 [2024-11-27 06:16:46.519777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.120 [2024-11-27 06:16:46.519792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.120 #10 NEW cov: 11761 ft: 14121 corp: 9/29b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 CopyPart- 00:07:17.120 [2024-11-27 06:16:46.559564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.120 [2024-11-27 06:16:46.559593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.120 #11 NEW cov: 11761 ft: 14179 corp: 10/30b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeBit- 00:07:17.120 [2024-11-27 06:16:46.599896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.120 [2024-11-27 06:16:46.599922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.120 [2024-11-27 06:16:46.600052] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.120 [2024-11-27 06:16:46.600068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.120 #12 NEW cov: 11761 ft: 14257 corp: 11/32b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 InsertByte- 00:07:17.120 [2024-11-27 06:16:46.640822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.120 [2024-11-27 06:16:46.640849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.120 [2024-11-27 06:16:46.640961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.120 [2024-11-27 06:16:46.640978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.120 [2024-11-27 06:16:46.641098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.120 [2024-11-27 06:16:46.641115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.120 [2024-11-27 06:16:46.641234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.120 [2024-11-27 06:16:46.641249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.120 [2024-11-27 06:16:46.641368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.120 [2024-11-27 06:16:46.641383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:17.380 #13 NEW cov: 11761 ft: 14285 corp: 12/37b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 CrossOver- 00:07:17.380 [2024-11-27 06:16:46.690976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.691006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.380 [2024-11-27 06:16:46.691134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.691148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.380 [2024-11-27 06:16:46.691265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.691280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:17.380 [2024-11-27 06:16:46.691400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.691419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.380 [2024-11-27 06:16:46.691525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.691542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:17.380 #14 NEW cov: 11761 ft: 14332 corp: 13/42b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeByte- 00:07:17.380 [2024-11-27 06:16:46.740378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.740404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.380 [2024-11-27 06:16:46.740514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.740529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.380 #15 NEW cov: 11761 ft: 14348 corp: 14/44b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ChangeByte- 00:07:17.380 [2024-11-27 06:16:46.780454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.780481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.380 [2024-11-27 06:16:46.780603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.780629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.380 #16 NEW cov: 11761 ft: 14383 corp: 15/46b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:07:17.380 [2024-11-27 06:16:46.821214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.821240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.380 [2024-11-27 06:16:46.821371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.821386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.380 [2024-11-27 06:16:46.821500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.821515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.380 [2024-11-27 06:16:46.821633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.821648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.380 [2024-11-27 06:16:46.821762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.821778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:17.380 #17 NEW cov: 11761 ft: 14397 corp: 16/51b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 ChangeBit- 00:07:17.380 [2024-11-27 06:16:46.870804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.380 [2024-11-27 06:16:46.870831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.381 [2024-11-27 06:16:46.870952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.381 [2024-11-27 06:16:46.870969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.381 #18 NEW cov: 11761 ft: 14450 corp: 17/53b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ChangeByte- 00:07:17.381 [2024-11-27 06:16:46.910870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.381 [2024-11-27 06:16:46.910899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.381 [2024-11-27 06:16:46.911016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.381 [2024-11-27 06:16:46.911033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.640 #19 NEW cov: 11761 ft: 14469 corp: 18/55b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:07:17.640 [2024-11-27 06:16:46.950926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.640 [2024-11-27 06:16:46.950953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.640 [2024-11-27 06:16:46.951068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.640 [2024-11-27 06:16:46.951084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.640 #20 NEW 
cov: 11761 ft: 14475 corp: 19/57b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:07:17.640 [2024-11-27 06:16:46.991128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.640 [2024-11-27 06:16:46.991155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.640 [2024-11-27 06:16:46.991284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.640 [2024-11-27 06:16:46.991299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.900 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:17.900 #21 NEW cov: 11784 ft: 14485 corp: 20/59b lim: 5 exec/s: 21 rss: 69Mb L: 2/5 MS: 1 EraseBytes- 00:07:17.900 [2024-11-27 06:16:47.313110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.900 [2024-11-27 06:16:47.313152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.900 [2024-11-27 06:16:47.313296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.900 [2024-11-27 06:16:47.313313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.900 [2024-11-27 06:16:47.313420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.900 [2024-11-27 06:16:47.313440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.900 [2024-11-27 06:16:47.313579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.900 [2024-11-27 06:16:47.313601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.900 [2024-11-27 06:16:47.313735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.900 [2024-11-27 06:16:47.313751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:17.900 #22 NEW cov: 11784 ft: 14640 corp: 21/64b lim: 5 exec/s: 22 rss: 69Mb L: 5/5 MS: 1 CopyPart- 00:07:17.900 [2024-11-27 06:16:47.362433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.900 [2024-11-27 06:16:47.362465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.900 [2024-11-27 06:16:47.362615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.900 [2024-11-27 06:16:47.362634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.900 #23 NEW cov: 11784 ft: 14673 corp: 22/66b lim: 5 exec/s: 23 rss: 69Mb L: 2/5 MS: 1 CopyPart- 00:07:17.900 [2024-11-27 06:16:47.413498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.900 [2024-11-27 06:16:47.413528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.900 [2024-11-27 06:16:47.413686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.900 [2024-11-27 06:16:47.413706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.900 [2024-11-27 06:16:47.413848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.900 [2024-11-27 06:16:47.413865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.900 [2024-11-27 06:16:47.413999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.900 [2024-11-27 06:16:47.414017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.900 [2024-11-27 06:16:47.414162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.900 [2024-11-27 06:16:47.414179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.160 #24 NEW cov: 11784 ft: 14737 corp: 23/71b lim: 5 exec/s: 24 rss: 69Mb L: 5/5 MS: 1 ChangeBinInt- 00:07:18.160 [2024-11-27 06:16:47.463639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.463671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.160 [2024-11-27 06:16:47.463806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.463823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.160 [2024-11-27 06:16:47.463966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.463982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.160 [2024-11-27 
06:16:47.464112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.464127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.160 [2024-11-27 06:16:47.464269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.464286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.160 #25 NEW cov: 11784 ft: 14810 corp: 24/76b lim: 5 exec/s: 25 rss: 69Mb L: 5/5 MS: 1 ShuffleBytes- 00:07:18.160 [2024-11-27 06:16:47.523885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.523916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.160 [2024-11-27 06:16:47.524050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.524067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.160 [2024-11-27 06:16:47.524204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.524222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.160 [2024-11-27 06:16:47.524354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.524369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.160 [2024-11-27 06:16:47.524502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.524519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.160 #26 NEW cov: 11784 ft: 14826 corp: 25/81b lim: 5 exec/s: 26 rss: 69Mb L: 5/5 MS: 1 CopyPart- 00:07:18.160 [2024-11-27 06:16:47.583422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.583450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.160 [2024-11-27 06:16:47.583586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.583606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.160 [2024-11-27 06:16:47.583738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.583755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.160 #27 NEW cov: 11784 ft: 14836 corp: 26/84b lim: 5 exec/s: 27 rss: 69Mb L: 3/5 MS: 1 EraseBytes- 00:07:18.160 [2024-11-27 06:16:47.633827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.633857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.160 [2024-11-27 06:16:47.634003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.634021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.160 [2024-11-27 06:16:47.634165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.634184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.160 [2024-11-27 06:16:47.634322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.634341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.160 #28 NEW cov: 11784 ft: 14844 corp: 27/88b lim: 5 exec/s: 28 rss: 69Mb L: 4/5 MS: 1 CopyPart- 00:07:18.160 [2024-11-27 06:16:47.693473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.160 [2024-11-27 06:16:47.693502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.161 [2024-11-27 06:16:47.693640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.161 [2024-11-27 06:16:47.693670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.420 #29 NEW cov: 11784 ft: 14849 corp: 28/90b lim: 5 exec/s: 29 rss: 69Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:18.420 [2024-11-27 06:16:47.753613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.420 [2024-11-27 06:16:47.753654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.420 [2024-11-27 06:16:47.753779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.420 [2024-11-27 06:16:47.753798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.420 #30 NEW cov: 11784 ft: 14861 corp: 29/92b lim: 5 exec/s: 30 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:18.420 [2024-11-27 06:16:47.813795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.420 [2024-11-27 06:16:47.813827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.420 [2024-11-27 06:16:47.813972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.420 [2024-11-27 06:16:47.813992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.420 #31 NEW cov: 11784 ft: 14877 corp: 30/94b lim: 5 exec/s: 31 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:18.420 [2024-11-27 06:16:47.864604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.420 [2024-11-27 06:16:47.864642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.420 [2024-11-27 06:16:47.864784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.420 [2024-11-27 06:16:47.864802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.420 [2024-11-27 06:16:47.864938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.420 [2024-11-27 06:16:47.864956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.420 [2024-11-27 06:16:47.865088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.420 [2024-11-27 06:16:47.865106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.420 #32 NEW cov: 11784 ft: 14895 corp: 31/98b lim: 5 exec/s: 32 rss: 70Mb L: 4/5 MS: 1 EraseBytes- 00:07:18.420 [2024-11-27 06:16:47.915048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.420 [2024-11-27 06:16:47.915079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.420 [2024-11-27 06:16:47.915220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.420 [2024-11-27 06:16:47.915239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:07:18.420 [2024-11-27 06:16:47.915376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.420 [2024-11-27 06:16:47.915393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.420 [2024-11-27 06:16:47.915527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.420 [2024-11-27 06:16:47.915544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.420 [2024-11-27 06:16:47.915685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.420 [2024-11-27 06:16:47.915703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.420 #33 NEW cov: 11784 ft: 14910 corp: 32/103b lim: 5 exec/s: 33 rss: 70Mb L: 5/5 MS: 1 ChangeByte- 00:07:18.680 [2024-11-27 06:16:47.964334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.680 [2024-11-27 06:16:47.964362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.680 [2024-11-27 06:16:47.964495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.680 [2024-11-27 06:16:47.964516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.680 #34 NEW cov: 11784 ft: 14926 corp: 33/105b lim: 5 exec/s: 34 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:18.680 [2024-11-27 06:16:48.015357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.680 [2024-11-27 06:16:48.015384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.680 [2024-11-27 06:16:48.015528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.680 [2024-11-27 06:16:48.015543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.680 [2024-11-27 06:16:48.015689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.680 [2024-11-27 06:16:48.015706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.680 [2024-11-27 06:16:48.015824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.680 [2024-11-27 06:16:48.015841] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.680 [2024-11-27 06:16:48.015984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.680 [2024-11-27 06:16:48.016000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.680 #35 NEW cov: 11784 ft: 14963 corp: 34/110b lim: 5 exec/s: 35 rss: 70Mb L: 5/5 MS: 1 ChangeBinInt- 00:07:18.680 [2024-11-27 06:16:48.074891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.680 [2024-11-27 06:16:48.074918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.680 [2024-11-27 06:16:48.075067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.680 [2024-11-27 06:16:48.075086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.680 [2024-11-27 06:16:48.075222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.680 [2024-11-27 06:16:48.075238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.681 #36 NEW cov: 11784 ft: 14970 corp: 35/113b lim: 5 exec/s: 36 rss: 70Mb L: 3/5 MS: 1 EraseBytes- 00:07:18.681 [2024-11-27 06:16:48.135795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.681 [2024-11-27 06:16:48.135823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.681 [2024-11-27 06:16:48.135958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.681 [2024-11-27 06:16:48.135974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.681 [2024-11-27 06:16:48.136111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.681 [2024-11-27 06:16:48.136127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.681 [2024-11-27 06:16:48.136267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.681 [2024-11-27 06:16:48.136284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.681 [2024-11-27 06:16:48.136421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:18.681 [2024-11-27 06:16:48.136439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.681 #37 NEW cov: 11784 ft: 15007 corp: 36/118b lim: 5 exec/s: 18 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:07:18.681 #37 DONE cov: 11784 ft: 15007 corp: 36/118b lim: 5 exec/s: 18 rss: 70Mb 00:07:18.681 Done 37 runs in 2 second(s) 00:07:18.941 06:16:48 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:18.941 06:16:48 -- ../common.sh@72 -- # (( i++ )) 00:07:18.941 06:16:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:18.941 06:16:48 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:18.941 06:16:48 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:18.941 06:16:48 -- nvmf/run.sh@24 -- # local timen=1 00:07:18.941 06:16:48 -- nvmf/run.sh@25 -- # local core=0x1 00:07:18.941 06:16:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:18.941 06:16:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:18.941 06:16:48 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:18.941 06:16:48 -- nvmf/run.sh@29 -- # port=4409 00:07:18.941 06:16:48 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:18.941 06:16:48 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:18.941 06:16:48 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:18.941 06:16:48 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:18.941 [2024-11-27 06:16:48.313091] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:18.941 [2024-11-27 06:16:48.313159] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid32985 ] 00:07:18.941 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.201 [2024-11-27 06:16:48.495690] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.201 [2024-11-27 06:16:48.559552] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:19.201 [2024-11-27 06:16:48.559691] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.201 [2024-11-27 06:16:48.617949] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:19.201 [2024-11-27 06:16:48.634282] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:19.201 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:19.201 INFO: Seed: 2924320932 00:07:19.201 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:19.201 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:19.201 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:19.201 INFO: A corpus is not provided, starting from an empty corpus 00:07:19.201 [2024-11-27 06:16:48.700334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.201 [2024-11-27 06:16:48.700371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.201 #2 INITED cov: 11557 ft: 11553 corp: 1/1b exec/s: 0 rss: 66Mb 00:07:19.548 [2024-11-27 06:16:48.750320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.548 [2024-11-27 06:16:48.750349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.548 #3 NEW cov: 11670 ft: 12118 corp: 2/2b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ShuffleBytes- 00:07:19.548 [2024-11-27 06:16:48.790418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.548 [2024-11-27 06:16:48.790444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.548 #4 NEW cov: 11676 ft: 12416 corp: 3/3b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeByte- 00:07:19.548 [2024-11-27 06:16:48.830574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.548 [2024-11-27 06:16:48.830604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.548 #5 NEW cov: 11761 ft: 12725 corp: 4/4b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ShuffleBytes- 00:07:19.548 [2024-11-27 06:16:48.870887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.548 [2024-11-27 06:16:48.870914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.548 [2024-11-27 06:16:48.871031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.548 [2024-11-27 06:16:48.871046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.548 #6 NEW cov: 11761 ft: 13474 corp: 5/6b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CopyPart- 00:07:19.548 [2024-11-27 06:16:48.910804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.548 [2024-11-27 06:16:48.910832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.548 
#7 NEW cov: 11761 ft: 13519 corp: 6/7b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ChangeByte- 00:07:19.548 [2024-11-27 06:16:48.951438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.548 [2024-11-27 06:16:48.951466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.548 [2024-11-27 06:16:48.951596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.548 [2024-11-27 06:16:48.951619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.548 [2024-11-27 06:16:48.951736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.548 [2024-11-27 06:16:48.951752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.548 #8 NEW cov: 11761 ft: 13826 corp: 7/10b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 CrossOver- 00:07:19.548 [2024-11-27 06:16:49.000660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.549 [2024-11-27 06:16:49.000688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.549 #9 NEW cov: 11761 ft: 13875 corp: 8/11b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 EraseBytes- 00:07:19.549 [2024-11-27 06:16:49.042182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.549 [2024-11-27 06:16:49.042212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.549 [2024-11-27 06:16:49.042334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.549 [2024-11-27 06:16:49.042351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.549 [2024-11-27 06:16:49.042470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.549 [2024-11-27 06:16:49.042486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.549 [2024-11-27 06:16:49.042611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.549 [2024-11-27 06:16:49.042629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.549 [2024-11-27 06:16:49.042755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.549 
[2024-11-27 06:16:49.042770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.899 #10 NEW cov: 11761 ft: 14247 corp: 9/16b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:19.899 [2024-11-27 06:16:49.082373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.082401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.082527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.082544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.082668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.082685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.082800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.082815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.082938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.082955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.899 #11 NEW cov: 11761 ft: 14326 corp: 10/21b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeByte- 00:07:19.899 [2024-11-27 06:16:49.132590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.132621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.132758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.132777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.132896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.132915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.133038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.133055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.133169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.133187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.899 #12 NEW cov: 11761 ft: 14348 corp: 11/26b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 CopyPart- 00:07:19.899 [2024-11-27 06:16:49.171530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.171559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.899 #13 NEW cov: 11761 ft: 14384 corp: 12/27b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeBit- 00:07:19.899 [2024-11-27 06:16:49.212830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.212858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.212991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.213007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.213133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.213150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.213272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.213289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.213410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.213428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.899 #14 NEW cov: 11761 ft: 14417 corp: 13/32b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ShuffleBytes- 00:07:19.899 [2024-11-27 06:16:49.261863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.261893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:19.899 #15 NEW cov: 11761 ft: 14447 corp: 14/33b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:19.899 [2024-11-27 06:16:49.302467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.302495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.302609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.302625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.302748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.302765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.302888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.302904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.899 #16 NEW cov: 11761 ft: 14455 corp: 15/37b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:19.899 [2024-11-27 06:16:49.342144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.342173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.899 #17 NEW cov: 11761 ft: 14470 corp: 16/38b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:07:19.899 [2024-11-27 06:16:49.382451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.382478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.382646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.382665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.382785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.382801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.899 #18 NEW cov: 11761 ft: 14500 corp: 17/41b lim: 5 exec/s: 0 rss: 68Mb L: 3/5 MS: 1 EraseBytes- 00:07:19.899 [2024-11-27 06:16:49.422520] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.422549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.899 [2024-11-27 06:16:49.422689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.899 [2024-11-27 06:16:49.422706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.158 #19 NEW cov: 11761 ft: 14510 corp: 18/43b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 CrossOver- 00:07:20.158 [2024-11-27 06:16:49.472211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.158 [2024-11-27 06:16:49.472239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.158 #20 NEW cov: 11761 ft: 14541 corp: 19/44b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:07:20.158 [2024-11-27 06:16:49.512381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.158 [2024-11-27 06:16:49.512410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.158 [2024-11-27 06:16:49.512533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.158 [2024-11-27 06:16:49.512550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.158 #21 NEW cov: 11761 ft: 14571 corp: 20/46b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 CrossOver- 00:07:20.158 [2024-11-27 06:16:49.552583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.158 [2024-11-27 06:16:49.552613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.417 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:20.417 #22 NEW cov: 11784 ft: 14608 corp: 21/47b lim: 5 exec/s: 22 rss: 69Mb L: 1/5 MS: 1 CrossOver- 00:07:20.417 [2024-11-27 06:16:49.863757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.417 [2024-11-27 06:16:49.863800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.417 #23 NEW cov: 11784 ft: 14694 corp: 22/48b lim: 5 exec/s: 23 rss: 69Mb L: 1/5 MS: 1 EraseBytes- 00:07:20.417 [2024-11-27 06:16:49.924181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.417 [2024-11-27 06:16:49.924210] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.417 [2024-11-27 06:16:49.924348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.417 [2024-11-27 06:16:49.924364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.676 #24 NEW cov: 11784 ft: 14803 corp: 23/50b lim: 5 exec/s: 24 rss: 69Mb L: 2/5 MS: 1 CrossOver- 00:07:20.676 [2024-11-27 06:16:49.984628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.676 [2024-11-27 06:16:49.984660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.676 [2024-11-27 06:16:49.984802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.676 [2024-11-27 06:16:49.984822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.676 [2024-11-27 06:16:49.984949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.676 [2024-11-27 06:16:49.984967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.676 #25 NEW cov: 11784 ft: 14863 corp: 24/53b lim: 5 exec/s: 25 rss: 69Mb L: 3/5 MS: 1 CrossOver- 00:07:20.676 [2024-11-27 06:16:50.034271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.676 [2024-11-27 06:16:50.034301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.676 #26 NEW cov: 11784 ft: 14869 corp: 25/54b lim: 5 exec/s: 26 rss: 69Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:20.676 [2024-11-27 06:16:50.094765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.676 [2024-11-27 06:16:50.094794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.676 [2024-11-27 06:16:50.094911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.676 [2024-11-27 06:16:50.094928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.676 #27 NEW cov: 11784 ft: 14877 corp: 26/56b lim: 5 exec/s: 27 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:07:20.676 [2024-11-27 06:16:50.144602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.676 [2024-11-27 06:16:50.144632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.676 #28 NEW 
cov: 11784 ft: 14936 corp: 27/57b lim: 5 exec/s: 28 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:20.676 [2024-11-27 06:16:50.195990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.676 [2024-11-27 06:16:50.196018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.676 [2024-11-27 06:16:50.196150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.676 [2024-11-27 06:16:50.196167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.676 [2024-11-27 06:16:50.196295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.676 [2024-11-27 06:16:50.196310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.676 [2024-11-27 06:16:50.196438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.676 [2024-11-27 06:16:50.196456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.676 [2024-11-27 06:16:50.196585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.676 [2024-11-27 06:16:50.196606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:20.938 #29 NEW cov: 11784 ft: 14944 corp: 28/62b lim: 5 exec/s: 29 rss: 69Mb L: 5/5 MS: 1 ChangeBit- 00:07:20.938 [2024-11-27 06:16:50.245197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.938 [2024-11-27 06:16:50.245226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.938 [2024-11-27 06:16:50.245354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.938 [2024-11-27 06:16:50.245371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.938 #30 NEW cov: 11784 ft: 14969 corp: 29/64b lim: 5 exec/s: 30 rss: 69Mb L: 2/5 MS: 1 CopyPart- 00:07:20.938 [2024-11-27 06:16:50.295413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.938 [2024-11-27 06:16:50.295441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.938 [2024-11-27 06:16:50.295575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.938 
[2024-11-27 06:16:50.295591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.938 #31 NEW cov: 11784 ft: 14975 corp: 30/66b lim: 5 exec/s: 31 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:07:20.938 [2024-11-27 06:16:50.345625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.938 [2024-11-27 06:16:50.345653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.938 [2024-11-27 06:16:50.345779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.938 [2024-11-27 06:16:50.345797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.938 #32 NEW cov: 11784 ft: 14988 corp: 31/68b lim: 5 exec/s: 32 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:07:20.938 [2024-11-27 06:16:50.396382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.938 [2024-11-27 06:16:50.396410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.938 [2024-11-27 06:16:50.396535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.938 [2024-11-27 06:16:50.396551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.938 [2024-11-27 06:16:50.396684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.938 [2024-11-27 06:16:50.396701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.938 [2024-11-27 06:16:50.396838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.938 [2024-11-27 06:16:50.396854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.938 #33 NEW cov: 11784 ft: 14994 corp: 32/72b lim: 5 exec/s: 33 rss: 70Mb L: 4/5 MS: 1 CopyPart- 00:07:20.938 [2024-11-27 06:16:50.455647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.938 [2024-11-27 06:16:50.455679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.198 #34 NEW cov: 11784 ft: 15003 corp: 33/73b lim: 5 exec/s: 34 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:07:21.198 [2024-11-27 06:16:50.507093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.198 [2024-11-27 06:16:50.507120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.198 [2024-11-27 06:16:50.507254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.198 [2024-11-27 06:16:50.507270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.198 [2024-11-27 06:16:50.507398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.198 [2024-11-27 06:16:50.507415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.198 [2024-11-27 06:16:50.507550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.198 [2024-11-27 06:16:50.507567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.198 [2024-11-27 06:16:50.507694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.198 [2024-11-27 06:16:50.507713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:21.198 #35 NEW cov: 11784 ft: 15035 corp: 34/78b lim: 5 exec/s: 35 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:21.198 [2024-11-27 06:16:50.566350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.198 [2024-11-27 06:16:50.566378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.198 [2024-11-27 06:16:50.566513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.198 [2024-11-27 06:16:50.566530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.198 #36 NEW cov: 11784 ft: 15052 corp: 35/80b lim: 5 exec/s: 36 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:07:21.198 [2024-11-27 06:16:50.617347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.198 [2024-11-27 06:16:50.617376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.198 [2024-11-27 06:16:50.617505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.198 [2024-11-27 06:16:50.617522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.198 [2024-11-27 06:16:50.617634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.198 [2024-11-27 06:16:50.617653] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.198 [2024-11-27 06:16:50.617786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.198 [2024-11-27 06:16:50.617803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.198 [2024-11-27 06:16:50.617944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.198 [2024-11-27 06:16:50.617961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:21.198 #37 NEW cov: 11784 ft: 15062 corp: 36/85b lim: 5 exec/s: 37 rss: 70Mb L: 5/5 MS: 1 ChangeBit- 00:07:21.198 [2024-11-27 06:16:50.676454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.198 [2024-11-27 06:16:50.676483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.198 #38 NEW cov: 11784 ft: 15070 corp: 37/86b lim: 5 exec/s: 19 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:07:21.198 #38 DONE cov: 11784 ft: 15070 corp: 37/86b lim: 5 exec/s: 19 rss: 70Mb 00:07:21.198 Done 38 runs in 2 second(s) 00:07:21.458 06:16:50 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:07:21.458 06:16:50 -- ../common.sh@72 -- # (( i++ )) 00:07:21.458 06:16:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:21.458 06:16:50 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:21.458 06:16:50 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:21.458 06:16:50 -- nvmf/run.sh@24 -- # local timen=1 00:07:21.458 06:16:50 -- nvmf/run.sh@25 -- # local core=0x1 00:07:21.458 06:16:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:21.458 06:16:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:21.458 06:16:50 -- nvmf/run.sh@29 -- # printf %02d 10 00:07:21.458 06:16:50 -- nvmf/run.sh@29 -- # port=4410 00:07:21.458 06:16:50 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:21.458 06:16:50 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:21.458 06:16:50 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:21.458 06:16:50 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:07:21.458 [2024-11-27 06:16:50.860320] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:21.458 [2024-11-27 06:16:50.860389] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid33417 ] 00:07:21.458 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.717 [2024-11-27 06:16:51.046122] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.717 [2024-11-27 06:16:51.111878] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:21.717 [2024-11-27 06:16:51.112005] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.717 [2024-11-27 06:16:51.169804] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:21.717 [2024-11-27 06:16:51.186142] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:21.717 INFO: Running with entropic power schedule (0xFF, 100). 00:07:21.717 INFO: Seed: 1180333191 00:07:21.717 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:21.717 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:21.717 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:21.717 INFO: A corpus is not provided, starting from an empty corpus 00:07:21.717 #2 INITED exec/s: 0 rss: 60Mb 00:07:21.717 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:21.717 This may also happen if the target rejected all inputs we tried so far 00:07:21.717 [2024-11-27 06:16:51.234102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101010 cdw11:10101010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:21.717 [2024-11-27 06:16:51.234130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.235 NEW_FUNC[1/670]: 0x447688 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:22.235 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:22.235 #22 NEW cov: 11580 ft: 11581 corp: 2/14b lim: 40 exec/s: 0 rss: 68Mb L: 13/13 MS: 5 CrossOver-ShuffleBytes-ChangeByte-CopyPart-InsertRepeatedBytes- 00:07:22.235 [2024-11-27 06:16:51.534867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:004921e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.235 [2024-11-27 06:16:51.534907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.235 #27 NEW cov: 11693 ft: 12073 corp: 3/27b lim: 40 exec/s: 0 rss: 68Mb L: 13/13 MS: 5 InsertByte-ChangeBinInt-InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:22.235 [2024-11-27 06:16:51.574853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:004921e7 cdw11:e7e7e70a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.235 [2024-11-27 06:16:51.574878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.235 #28 NEW cov: 11699 ft: 12365 corp: 4/41b lim: 40 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 CrossOver- 00:07:22.235 
[2024-11-27 06:16:51.614954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:004920e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.235 [2024-11-27 06:16:51.614979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.235 #29 NEW cov: 11784 ft: 12617 corp: 5/54b lim: 40 exec/s: 0 rss: 68Mb L: 13/14 MS: 1 ChangeBit- 00:07:22.235 [2024-11-27 06:16:51.655085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:004921e7 cdw11:e9e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.235 [2024-11-27 06:16:51.655109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.235 #30 NEW cov: 11784 ft: 12702 corp: 6/67b lim: 40 exec/s: 0 rss: 68Mb L: 13/14 MS: 1 ChangeBinInt- 00:07:22.235 [2024-11-27 06:16:51.685420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:004920e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.235 [2024-11-27 06:16:51.685445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.235 [2024-11-27 06:16:51.685502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e7ba1010 cdw11:10101010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.235 [2024-11-27 06:16:51.685516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.235 [2024-11-27 06:16:51.685573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:101010e7 cdw11:e7e7e710 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.235 [2024-11-27 06:16:51.685586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.235 #31 NEW cov: 11784 ft: 13208 corp: 7/93b lim: 40 exec/s: 0 rss: 68Mb L: 26/26 MS: 1 CrossOver- 00:07:22.235 [2024-11-27 06:16:51.725306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:004921e7 cdw11:e9e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.235 [2024-11-27 06:16:51.725330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.235 #32 NEW cov: 11784 ft: 13274 corp: 8/106b lim: 40 exec/s: 0 rss: 68Mb L: 13/26 MS: 1 ShuffleBytes- 00:07:22.235 [2024-11-27 06:16:51.765455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:004921e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.235 [2024-11-27 06:16:51.765480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.495 #33 NEW cov: 11784 ft: 13338 corp: 9/119b lim: 40 exec/s: 0 rss: 68Mb L: 13/26 MS: 1 CMP- DE: "\001\000\002\000"- 00:07:22.495 [2024-11-27 06:16:51.795507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101010 cdw11:10101010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.495 [2024-11-27 06:16:51.795531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:22.495 #34 NEW cov: 11784 ft: 13387 corp: 10/132b lim: 40 exec/s: 0 rss: 68Mb L: 13/26 MS: 1 ChangeBit- 00:07:22.495 [2024-11-27 06:16:51.835643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101010 cdw11:10101010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.495 [2024-11-27 06:16:51.835667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.495 #35 NEW cov: 11784 ft: 13425 corp: 11/145b lim: 40 exec/s: 0 rss: 68Mb L: 13/26 MS: 1 ChangeBit- 00:07:22.495 [2024-11-27 06:16:51.875743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101010 cdw11:10101010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.495 [2024-11-27 06:16:51.875768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.495 #36 NEW cov: 11784 ft: 13467 corp: 12/158b lim: 40 exec/s: 0 rss: 68Mb L: 13/26 MS: 1 ChangeBit- 00:07:22.495 [2024-11-27 06:16:51.915855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.495 [2024-11-27 06:16:51.915880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.495 #37 NEW cov: 11784 ft: 13507 corp: 13/167b lim: 40 exec/s: 0 rss: 69Mb L: 9/26 MS: 1 EraseBytes- 00:07:22.495 [2024-11-27 06:16:51.955976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00e7e7e7 cdw11:e7e7e7ed SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.495 [2024-11-27 06:16:51.956000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.495 #38 NEW cov: 11784 ft: 13572 corp: 14/176b lim: 40 exec/s: 0 rss: 69Mb L: 9/26 MS: 1 ChangeBinInt- 00:07:22.495 [2024-11-27 06:16:51.996328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:38383838 cdw11:38383838 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.495 [2024-11-27 06:16:51.996353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.495 [2024-11-27 06:16:51.996425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:38383838 cdw11:38383838 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.495 [2024-11-27 06:16:51.996439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.495 [2024-11-27 06:16:51.996498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:3800e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.495 [2024-11-27 06:16:51.996514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.495 #39 NEW cov: 11784 ft: 13599 corp: 15/202b lim: 40 exec/s: 0 rss: 69Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:22.754 [2024-11-27 06:16:52.036203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:004921e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.754 [2024-11-27 
06:16:52.036229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.754 #40 NEW cov: 11784 ft: 13664 corp: 16/215b lim: 40 exec/s: 0 rss: 69Mb L: 13/26 MS: 1 ChangeBit- 00:07:22.754 [2024-11-27 06:16:52.076331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101010 cdw11:10101011 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.754 [2024-11-27 06:16:52.076358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.754 #41 NEW cov: 11784 ft: 13677 corp: 17/225b lim: 40 exec/s: 0 rss: 69Mb L: 10/26 MS: 1 EraseBytes- 00:07:22.754 [2024-11-27 06:16:52.116450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba100049 cdw11:20e7e710 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.754 [2024-11-27 06:16:52.116476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.754 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:22.754 #42 NEW cov: 11807 ft: 13736 corp: 18/235b lim: 40 exec/s: 0 rss: 69Mb L: 10/26 MS: 1 CrossOver- 00:07:22.754 [2024-11-27 06:16:52.156553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:20000200 cdw11:0a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.754 [2024-11-27 06:16:52.156579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.754 #45 NEW cov: 11807 ft: 13758 corp: 19/247b lim: 40 exec/s: 0 rss: 69Mb L: 12/26 MS: 3 PersAutoDict-ChangeByte-InsertRepeatedBytes- DE: "\001\000\002\000"- 00:07:22.754 [2024-11-27 06:16:52.196668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.754 [2024-11-27 06:16:52.196693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.754 #46 NEW cov: 11807 ft: 13766 corp: 20/256b lim: 40 exec/s: 0 rss: 69Mb L: 9/26 MS: 1 ShuffleBytes- 00:07:22.754 [2024-11-27 06:16:52.237019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.754 [2024-11-27 06:16:52.237045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.754 [2024-11-27 06:16:52.237102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.754 [2024-11-27 06:16:52.237116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.755 [2024-11-27 06:16:52.237171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.755 [2024-11-27 06:16:52.237184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.755 #48 NEW cov: 11807 ft: 13771 
corp: 21/286b lim: 40 exec/s: 48 rss: 69Mb L: 30/30 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:22.755 [2024-11-27 06:16:52.277000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:bae7e7e7 cdw11:e7e71010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.755 [2024-11-27 06:16:52.277025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.755 [2024-11-27 06:16:52.277102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:10101010 cdw11:10141011 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:22.755 [2024-11-27 06:16:52.277116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.014 #49 NEW cov: 11807 ft: 13963 corp: 22/304b lim: 40 exec/s: 49 rss: 69Mb L: 18/30 MS: 1 CrossOver- 00:07:23.014 [2024-11-27 06:16:52.317109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:20000200 cdw11:0a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.014 [2024-11-27 06:16:52.317134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.014 [2024-11-27 06:16:52.317193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.014 [2024-11-27 06:16:52.317206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.014 #50 NEW cov: 11807 ft: 13984 corp: 23/324b lim: 40 exec/s: 50 rss: 69Mb L: 20/30 MS: 1 CopyPart- 00:07:23.014 [2024-11-27 06:16:52.357133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.014 [2024-11-27 06:16:52.357158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.014 #51 NEW cov: 11807 ft: 14055 corp: 24/333b lim: 40 exec/s: 51 rss: 69Mb L: 9/30 MS: 1 CopyPart- 00:07:23.014 [2024-11-27 06:16:52.397255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101010 cdw11:10101010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.014 [2024-11-27 06:16:52.397281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.014 #52 NEW cov: 11807 ft: 14229 corp: 25/346b lim: 40 exec/s: 52 rss: 69Mb L: 13/30 MS: 1 ShuffleBytes- 00:07:23.014 [2024-11-27 06:16:52.437340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:004921e7 cdw11:e9e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.014 [2024-11-27 06:16:52.437366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.014 #53 NEW cov: 11807 ft: 14237 corp: 26/359b lim: 40 exec/s: 53 rss: 69Mb L: 13/30 MS: 1 CopyPart- 00:07:23.014 [2024-11-27 06:16:52.477431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:20010002 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.014 [2024-11-27 06:16:52.477456] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.014 #54 NEW cov: 11807 ft: 14256 corp: 27/371b lim: 40 exec/s: 54 rss: 69Mb L: 12/30 MS: 1 PersAutoDict- DE: "\001\000\002\000"- 00:07:23.014 [2024-11-27 06:16:52.517688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:bae7e7e7 cdw11:e7e71810 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.014 [2024-11-27 06:16:52.517714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.014 [2024-11-27 06:16:52.517773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:10101010 cdw11:10141011 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.014 [2024-11-27 06:16:52.517786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.014 #55 NEW cov: 11807 ft: 14297 corp: 28/389b lim: 40 exec/s: 55 rss: 69Mb L: 18/30 MS: 1 ChangeBit- 00:07:23.274 [2024-11-27 06:16:52.557728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00e7c7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.274 [2024-11-27 06:16:52.557757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.274 #56 NEW cov: 11807 ft: 14312 corp: 29/398b lim: 40 exec/s: 56 rss: 70Mb L: 9/30 MS: 1 ChangeBit- 00:07:23.274 [2024-11-27 06:16:52.597840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0021e949 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.274 [2024-11-27 06:16:52.597865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.274 #57 NEW cov: 11807 ft: 14382 corp: 30/411b lim: 40 exec/s: 57 rss: 70Mb L: 13/30 MS: 1 ShuffleBytes- 00:07:23.274 [2024-11-27 06:16:52.637948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:004921e7 cdw11:e9e72ae7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.274 [2024-11-27 06:16:52.637974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.274 #58 NEW cov: 11807 ft: 14384 corp: 31/424b lim: 40 exec/s: 58 rss: 70Mb L: 13/30 MS: 1 ChangeByte- 00:07:23.274 [2024-11-27 06:16:52.668065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101010 cdw11:14101010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.274 [2024-11-27 06:16:52.668090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.274 #59 NEW cov: 11807 ft: 14436 corp: 32/437b lim: 40 exec/s: 59 rss: 70Mb L: 13/30 MS: 1 ShuffleBytes- 00:07:23.274 [2024-11-27 06:16:52.708290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00e77474 cdw11:74747474 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.274 [2024-11-27 06:16:52.708315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.274 [2024-11-27 06:16:52.708391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY 
RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:7474e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.274 [2024-11-27 06:16:52.708405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.274 #60 NEW cov: 11807 ft: 14441 corp: 33/454b lim: 40 exec/s: 60 rss: 70Mb L: 17/30 MS: 1 InsertRepeatedBytes- 00:07:23.274 [2024-11-27 06:16:52.748256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00e72fe7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.274 [2024-11-27 06:16:52.748281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.274 #61 NEW cov: 11807 ft: 14447 corp: 34/464b lim: 40 exec/s: 61 rss: 70Mb L: 10/30 MS: 1 InsertByte- 00:07:23.274 [2024-11-27 06:16:52.788379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101012 cdw11:10141010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.274 [2024-11-27 06:16:52.788404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.534 #62 NEW cov: 11807 ft: 14480 corp: 35/477b lim: 40 exec/s: 62 rss: 70Mb L: 13/30 MS: 1 ShuffleBytes- 00:07:23.534 [2024-11-27 06:16:52.828518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101010 cdw11:11101010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:52.828542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.534 #63 NEW cov: 11807 ft: 14482 corp: 36/490b lim: 40 exec/s: 63 rss: 70Mb L: 13/30 MS: 1 ChangeBit- 00:07:23.534 [2024-11-27 06:16:52.858623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101010 cdw11:10101010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:52.858651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.534 #64 NEW cov: 11807 ft: 14507 corp: 37/504b lim: 40 exec/s: 64 rss: 70Mb L: 14/30 MS: 1 InsertByte- 00:07:23.534 [2024-11-27 06:16:52.899015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00004920 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:52.899040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.534 [2024-11-27 06:16:52.899112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e7e7ba10 cdw11:10101010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:52.899125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.534 [2024-11-27 06:16:52.899184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:10101010 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:52.899197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.534 #65 NEW cov: 11807 ft: 14529 corp: 38/530b lim: 40 exec/s: 65 rss: 70Mb L: 
26/30 MS: 1 CopyPart- 00:07:23.534 [2024-11-27 06:16:52.938868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101010 cdw11:14101110 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:52.938892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.534 #66 NEW cov: 11807 ft: 14543 corp: 39/539b lim: 40 exec/s: 66 rss: 70Mb L: 9/30 MS: 1 EraseBytes- 00:07:23.534 [2024-11-27 06:16:52.969055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101010 cdw11:10101010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:52.969081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.534 [2024-11-27 06:16:52.969139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:10101110 cdw11:10431010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:52.969153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.534 #67 NEW cov: 11807 ft: 14545 corp: 40/555b lim: 40 exec/s: 67 rss: 70Mb L: 16/30 MS: 1 CopyPart- 00:07:23.534 [2024-11-27 06:16:53.009440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:bae7e7e7 cdw11:93939393 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:53.009465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.534 [2024-11-27 06:16:53.009524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:93939393 cdw11:93939393 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:53.009538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.534 [2024-11-27 06:16:53.009601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:93939393 cdw11:9393e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:53.009614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.534 [2024-11-27 06:16:53.009671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:10101010 cdw11:10101014 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:53.009684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.534 #68 NEW cov: 11807 ft: 14994 corp: 41/591b lim: 40 exec/s: 68 rss: 70Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:07:23.534 [2024-11-27 06:16:53.049301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101010 cdw11:101010ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:53.049326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.534 [2024-11-27 06:16:53.049384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff1010 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.534 [2024-11-27 06:16:53.049398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.795 #69 NEW cov: 11807 ft: 14999 corp: 42/611b lim: 40 exec/s: 69 rss: 70Mb L: 20/36 MS: 1 InsertRepeatedBytes- 00:07:23.795 [2024-11-27 06:16:53.089409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba101010 cdw11:10101010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.795 [2024-11-27 06:16:53.089435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.795 [2024-11-27 06:16:53.089496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:10101010 cdw11:10101010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.795 [2024-11-27 06:16:53.089509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.795 #70 NEW cov: 11807 ft: 15010 corp: 43/634b lim: 40 exec/s: 70 rss: 70Mb L: 23/36 MS: 1 CopyPart- 00:07:23.795 [2024-11-27 06:16:53.129556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba100049 cdw11:20e7e710 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.795 [2024-11-27 06:16:53.129582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.795 [2024-11-27 06:16:53.129644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:10e70049 cdw11:21e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.795 [2024-11-27 06:16:53.129658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.795 #71 NEW cov: 11807 ft: 15028 corp: 44/650b lim: 40 exec/s: 71 rss: 70Mb L: 16/36 MS: 1 CrossOver- 00:07:23.795 [2024-11-27 06:16:53.169544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:20010019 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.795 [2024-11-27 06:16:53.169569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.795 #72 NEW cov: 11807 ft: 15036 corp: 45/662b lim: 40 exec/s: 72 rss: 70Mb L: 12/36 MS: 1 CMP- DE: "\031\000\000\000"- 00:07:23.795 [2024-11-27 06:16:53.209651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba100049 cdw11:20e7e710 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.795 [2024-11-27 06:16:53.209676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.795 #73 NEW cov: 11807 ft: 15087 corp: 46/672b lim: 40 exec/s: 36 rss: 70Mb L: 10/36 MS: 1 ChangeBit- 00:07:23.795 #73 DONE cov: 11807 ft: 15087 corp: 46/672b lim: 40 exec/s: 36 rss: 70Mb 00:07:23.795 ###### Recommended dictionary. ###### 00:07:23.795 "\001\000\002\000" # Uses: 2 00:07:23.795 "\031\000\000\000" # Uses: 0 00:07:23.795 ###### End of recommended dictionary. 
###### 00:07:23.795 Done 73 runs in 2 second(s) 00:07:24.054 06:16:53 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:07:24.054 06:16:53 -- ../common.sh@72 -- # (( i++ )) 00:07:24.054 06:16:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:24.054 06:16:53 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:24.054 06:16:53 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:24.054 06:16:53 -- nvmf/run.sh@24 -- # local timen=1 00:07:24.054 06:16:53 -- nvmf/run.sh@25 -- # local core=0x1 00:07:24.054 06:16:53 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:24.054 06:16:53 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:24.055 06:16:53 -- nvmf/run.sh@29 -- # printf %02d 11 00:07:24.055 06:16:53 -- nvmf/run.sh@29 -- # port=4411 00:07:24.055 06:16:53 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:24.055 06:16:53 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:24.055 06:16:53 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:24.055 06:16:53 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:07:24.055 [2024-11-27 06:16:53.387800] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:24.055 [2024-11-27 06:16:53.387869] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid33828 ] 00:07:24.055 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.055 [2024-11-27 06:16:53.564064] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.314 [2024-11-27 06:16:53.628218] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:24.314 [2024-11-27 06:16:53.628360] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.314 [2024-11-27 06:16:53.686495] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:24.314 [2024-11-27 06:16:53.702882] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:24.314 INFO: Running with entropic power schedule (0xFF, 100). 00:07:24.314 INFO: Seed: 3699347072 00:07:24.314 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:24.314 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:24.314 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:24.314 INFO: A corpus is not provided, starting from an empty corpus 00:07:24.314 #2 INITED exec/s: 0 rss: 60Mb 00:07:24.314 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
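The "###### Recommended dictionary ######" block that closes run 10 above lists byte strings (printed as octal escapes) that libFuzzer's auto-dictionary found productive, with "# Uses:" counting roughly how often each entry fed a mutation that reached new coverage. Below is a hedged sketch of carrying those entries over as a seed dictionary, written in libFuzzer's documented dictionary-file syntax; the file name and entry labels are placeholders, and whether llvm_nvme_fuzz forwards a -dict= option to the underlying libFuzzer engine is an assumption, not something this log confirms.

```sh
# A minimal sketch, assuming libFuzzer's standard dictionary syntax.
# The logged octal escapes map to hex: "\001\000\002\000" -> \x01\x00\x02\x00,
# "\031\000\000\000" -> \x19\x00\x00\x00. Entry names are arbitrary placeholders.
cat > llvm_nvmf_10.dict <<'EOF'
entry1="\x01\x00\x02\x00"
entry2="\x19\x00\x00\x00"
EOF
```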
00:07:24.314 This may also happen if the target rejected all inputs we tried so far 00:07:24.314 [2024-11-27 06:16:53.768814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9ffffff cdw11:f7f7f7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.314 [2024-11-27 06:16:53.768851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.574 NEW_FUNC[1/671]: 0x4493f8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:24.574 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:24.574 #7 NEW cov: 11591 ft: 11592 corp: 2/13b lim: 40 exec/s: 0 rss: 68Mb L: 12/12 MS: 5 CopyPart-CMP-ChangeBinInt-ChangeBinInt-InsertRepeatedBytes- DE: "\377\377\377\377"- 00:07:24.574 [2024-11-27 06:16:54.090626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0ac8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.574 [2024-11-27 06:16:54.090677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.574 [2024-11-27 06:16:54.090825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.574 [2024-11-27 06:16:54.090849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.574 [2024-11-27 06:16:54.090997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.574 [2024-11-27 06:16:54.091019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.574 [2024-11-27 06:16:54.091167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.574 [2024-11-27 06:16:54.091190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.834 #8 NEW cov: 11705 ft: 12972 corp: 3/47b lim: 40 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:24.834 [2024-11-27 06:16:54.149764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9ffffff cdw11:f7f7f7ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.149795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.834 #9 NEW cov: 11711 ft: 13292 corp: 4/59b lim: 40 exec/s: 0 rss: 68Mb L: 12/34 MS: 1 ChangeBit- 00:07:24.834 [2024-11-27 06:16:54.199952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9ffffff cdw11:f7f7f7ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.199980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.834 #10 NEW cov: 11796 ft: 13590 corp: 5/71b lim: 40 exec/s: 0 rss: 68Mb L: 12/34 MS: 1 ChangeBit- 00:07:24.834 [2024-11-27 
06:16:54.260921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9ffffff cdw11:f7f7f7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.260951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.834 [2024-11-27 06:16:54.261090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.261107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.834 [2024-11-27 06:16:54.261231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.261248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.834 [2024-11-27 06:16:54.261372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.261391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.834 #11 NEW cov: 11796 ft: 13671 corp: 6/108b lim: 40 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:07:24.834 [2024-11-27 06:16:54.311131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0ac8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.311159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.834 [2024-11-27 06:16:54.311306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.311324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.834 [2024-11-27 06:16:54.311461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.311477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.834 [2024-11-27 06:16:54.311615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:bdc8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.311631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.834 #12 NEW cov: 11796 ft: 13749 corp: 7/143b lim: 40 exec/s: 0 rss: 68Mb L: 35/37 MS: 1 InsertByte- 00:07:24.834 [2024-11-27 06:16:54.360892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9ffffff cdw11:f7ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.360919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
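For reading the stream above: each "#N NEW" line is standard libFuzzer status output. "cov" is the number of covered code edges, "ft" the number of features (edge plus hit-count signals), "corp" the corpus size in units and bytes, "lim" the current input-length cap, "exec/s" the execution rate (0 simply means less than a second has elapsed), "rss" resident memory, "L:" the new unit's length against the current maximum, and "MS:" the mutation sequence that produced it (ChangeBit, EraseBytes, CrossOver, and so on; PersAutoDict/CMP steps quote the dictionary bytes they used after "DE:"). A small sketch for pulling the coverage progression out of a saved copy of this console output; "build.log" is a placeholder name, not a file the pipeline produces.

```sh
# Hedged sketch: list each new-coverage event as "event cov ft corp".
# Assumes the console output above was saved to build.log.
grep -o '#[0-9]* NEW cov: [0-9]* ft: [0-9]* corp: [0-9]*/[0-9]*b' build.log |
  awk '{ sub(/#/, "", $1); printf "event=%s cov=%s ft=%s corp=%s\n", $1, $4, $6, $8 }'
```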
00:07:24.834 [2024-11-27 06:16:54.361052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.361069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.834 [2024-11-27 06:16:54.361205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.361222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.834 [2024-11-27 06:16:54.361351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.834 [2024-11-27 06:16:54.361369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.093 #13 NEW cov: 11796 ft: 13838 corp: 8/180b lim: 40 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:25.093 [2024-11-27 06:16:54.410616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9ffffff cdw11:f7f7f7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.093 [2024-11-27 06:16:54.410644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.093 #14 NEW cov: 11796 ft: 13874 corp: 9/194b lim: 40 exec/s: 0 rss: 68Mb L: 14/37 MS: 1 CrossOver- 00:07:25.093 [2024-11-27 06:16:54.450636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9ffffff cdw11:f7f7f7ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.093 [2024-11-27 06:16:54.450663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.093 #15 NEW cov: 11796 ft: 13971 corp: 10/206b lim: 40 exec/s: 0 rss: 68Mb L: 12/37 MS: 1 ChangeBinInt- 00:07:25.093 [2024-11-27 06:16:54.491092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9f9ffff cdw11:fff7f7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.093 [2024-11-27 06:16:54.491119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.093 [2024-11-27 06:16:54.491261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fff7f7f7 cdw11:fffffff7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.093 [2024-11-27 06:16:54.491277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.093 #16 NEW cov: 11796 ft: 14260 corp: 11/229b lim: 40 exec/s: 0 rss: 68Mb L: 23/37 MS: 1 CopyPart- 00:07:25.093 [2024-11-27 06:16:54.541743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0ac8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.094 [2024-11-27 06:16:54.541772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.094 [2024-11-27 06:16:54.541910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND 
(81) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.094 [2024-11-27 06:16:54.541928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.094 [2024-11-27 06:16:54.542062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.094 [2024-11-27 06:16:54.542078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.094 [2024-11-27 06:16:54.542213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.094 [2024-11-27 06:16:54.542231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.094 #17 NEW cov: 11796 ft: 14381 corp: 12/263b lim: 40 exec/s: 0 rss: 68Mb L: 34/37 MS: 1 ShuffleBytes- 00:07:25.094 [2024-11-27 06:16:54.580924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9ffffff cdw11:0c00f7ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.094 [2024-11-27 06:16:54.580951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.094 #18 NEW cov: 11796 ft: 14428 corp: 13/275b lim: 40 exec/s: 0 rss: 69Mb L: 12/37 MS: 1 ChangeBinInt- 00:07:25.094 [2024-11-27 06:16:54.621151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.094 [2024-11-27 06:16:54.621179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.353 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:25.353 #19 NEW cov: 11819 ft: 14542 corp: 14/287b lim: 40 exec/s: 0 rss: 69Mb L: 12/37 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:25.353 [2024-11-27 06:16:54.660871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9ffffff cdw11:0c00f7ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.660900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.353 #20 NEW cov: 11819 ft: 14559 corp: 15/296b lim: 40 exec/s: 0 rss: 69Mb L: 9/37 MS: 1 EraseBytes- 00:07:25.353 [2024-11-27 06:16:54.702204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.702233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.353 [2024-11-27 06:16:54.702364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000000f9 cdw11:f9ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.702381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.353 [2024-11-27 06:16:54.702515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:f7f7f7ff cdw11:f7f7f7ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.702531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.353 [2024-11-27 06:16:54.702661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fffff7f7 cdw11:f7fff7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.702677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.353 #21 NEW cov: 11819 ft: 14580 corp: 16/330b lim: 40 exec/s: 0 rss: 69Mb L: 34/37 MS: 1 InsertRepeatedBytes- 00:07:25.353 [2024-11-27 06:16:54.751230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9ffffff cdw11:f700f7ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.751258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.353 #22 NEW cov: 11819 ft: 14600 corp: 17/339b lim: 40 exec/s: 22 rss: 69Mb L: 9/37 MS: 1 CrossOver- 00:07:25.353 [2024-11-27 06:16:54.791958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.791986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.353 [2024-11-27 06:16:54.792115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000000f9 cdw11:f9ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.792130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.353 [2024-11-27 06:16:54.792260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:f7f7f7ff cdw11:f7f7f7ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.792277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.353 [2024-11-27 06:16:54.792405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fffff7f7 cdw11:f7fff7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.792422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.353 #23 NEW cov: 11819 ft: 14657 corp: 18/373b lim: 40 exec/s: 23 rss: 69Mb L: 34/37 MS: 1 CrossOver- 00:07:25.353 [2024-11-27 06:16:54.832576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0ac8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.832606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.353 [2024-11-27 06:16:54.832745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.832762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.353 [2024-11-27 06:16:54.832898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.832915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.353 [2024-11-27 06:16:54.833055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fffff7f7 cdw11:f7f7f7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.833070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.353 #24 NEW cov: 11819 ft: 14678 corp: 19/407b lim: 40 exec/s: 24 rss: 69Mb L: 34/37 MS: 1 CrossOver- 00:07:25.353 [2024-11-27 06:16:54.882453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0ac8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.353 [2024-11-27 06:16:54.882481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.354 [2024-11-27 06:16:54.882617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8ffc8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.354 [2024-11-27 06:16:54.882630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.354 [2024-11-27 06:16:54.882652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.354 [2024-11-27 06:16:54.882662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.354 [2024-11-27 06:16:54.882678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fffff7f7 cdw11:f7f7f7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.354 [2024-11-27 06:16:54.882688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.613 #25 NEW cov: 11819 ft: 14684 corp: 20/441b lim: 40 exec/s: 25 rss: 69Mb L: 34/37 MS: 1 CopyPart- 00:07:25.613 [2024-11-27 06:16:54.921915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.613 [2024-11-27 06:16:54.921943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.613 #26 NEW cov: 11819 ft: 14694 corp: 21/453b lim: 40 exec/s: 26 rss: 70Mb L: 12/37 MS: 1 ChangeBinInt- 00:07:25.613 [2024-11-27 06:16:54.972979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00c8c800 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.613 [2024-11-27 06:16:54.973005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.613 [2024-11-27 06:16:54.973139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00f9f9ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:25.613 [2024-11-27 06:16:54.973157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.613 [2024-11-27 06:16:54.973288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:fffff7f7 cdw11:f7fff7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.613 [2024-11-27 06:16:54.973304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.613 [2024-11-27 06:16:54.973442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:f7ffffff cdw11:f7f7f7ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.613 [2024-11-27 06:16:54.973458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.613 #27 NEW cov: 11819 ft: 14716 corp: 22/489b lim: 40 exec/s: 27 rss: 70Mb L: 36/37 MS: 1 CrossOver- 00:07:25.613 [2024-11-27 06:16:55.033130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:3d000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.613 [2024-11-27 06:16:55.033161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.613 [2024-11-27 06:16:55.033307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000000f9 cdw11:f9ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.613 [2024-11-27 06:16:55.033324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.613 [2024-11-27 06:16:55.033457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:f7f7f7ff cdw11:f7f7f7ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.613 [2024-11-27 06:16:55.033475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.613 [2024-11-27 06:16:55.033609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fffff7f7 cdw11:f7fff7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.613 [2024-11-27 06:16:55.033629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.613 #28 NEW cov: 11819 ft: 14781 corp: 23/523b lim: 40 exec/s: 28 rss: 70Mb L: 34/37 MS: 1 ChangeByte- 00:07:25.613 [2024-11-27 06:16:55.083414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00c8c800 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.613 [2024-11-27 06:16:55.083440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.613 [2024-11-27 06:16:55.083587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00f9f9ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.613 [2024-11-27 06:16:55.083610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.613 [2024-11-27 06:16:55.083748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:fffff7f7 cdw11:f7fff7f7 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:25.613 [2024-11-27 06:16:55.083765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.613 [2024-11-27 06:16:55.083894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:f7fffeff cdw11:f7f7f7ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.613 [2024-11-27 06:16:55.083910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.613 #29 NEW cov: 11819 ft: 14788 corp: 24/559b lim: 40 exec/s: 29 rss: 70Mb L: 36/37 MS: 1 ChangeBit- 00:07:25.613 [2024-11-27 06:16:55.142795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9ffffff cdw11:f7f7f7ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.613 [2024-11-27 06:16:55.142824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.873 #30 NEW cov: 11819 ft: 14826 corp: 25/571b lim: 40 exec/s: 30 rss: 70Mb L: 12/37 MS: 1 ChangeBit- 00:07:25.874 [2024-11-27 06:16:55.193695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0000ad00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.874 [2024-11-27 06:16:55.193726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.874 [2024-11-27 06:16:55.193868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:f9f9ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.874 [2024-11-27 06:16:55.193884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.874 [2024-11-27 06:16:55.194023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:fff7f7f7 cdw11:fff7f7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.874 [2024-11-27 06:16:55.194042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.874 [2024-11-27 06:16:55.194191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fffffff7 cdw11:f7f7fff7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.874 [2024-11-27 06:16:55.194208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.874 #31 NEW cov: 11819 ft: 14853 corp: 26/606b lim: 40 exec/s: 31 rss: 70Mb L: 35/37 MS: 1 InsertByte- 00:07:25.874 [2024-11-27 06:16:55.243221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:0a0a0a0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.874 [2024-11-27 06:16:55.243250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.874 [2024-11-27 06:16:55.243381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0a0a0a0a cdw11:0a0a0a0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.874 [2024-11-27 06:16:55.243401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.874 #32 NEW cov: 11819 ft: 14862 corp: 27/625b lim: 40 exec/s: 32 
rss: 70Mb L: 19/37 MS: 1 InsertRepeatedBytes- 00:07:25.874 [2024-11-27 06:16:55.293393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9f9ffff cdw11:fff7f7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.874 [2024-11-27 06:16:55.293424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.874 [2024-11-27 06:16:55.293560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fff7f7f7 cdw11:fffff7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.874 [2024-11-27 06:16:55.293578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.874 #33 NEW cov: 11819 ft: 14883 corp: 28/643b lim: 40 exec/s: 33 rss: 70Mb L: 18/37 MS: 1 EraseBytes- 00:07:25.874 [2024-11-27 06:16:55.334110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0ac8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.874 [2024-11-27 06:16:55.334139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.874 [2024-11-27 06:16:55.334285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:c8c822c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.874 [2024-11-27 06:16:55.334303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.874 [2024-11-27 06:16:55.334432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.874 [2024-11-27 06:16:55.334452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.874 [2024-11-27 06:16:55.334574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.874 [2024-11-27 06:16:55.334590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.874 #34 NEW cov: 11819 ft: 14897 corp: 29/677b lim: 40 exec/s: 34 rss: 70Mb L: 34/37 MS: 1 ChangeBinInt- 00:07:25.874 [2024-11-27 06:16:55.383353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9fffff7 cdw11:ff00fff7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.874 [2024-11-27 06:16:55.383382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.134 #35 NEW cov: 11819 ft: 14912 corp: 30/686b lim: 40 exec/s: 35 rss: 70Mb L: 9/37 MS: 1 ShuffleBytes- 00:07:26.134 [2024-11-27 06:16:55.444459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f70ac8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.134 [2024-11-27 06:16:55.444491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.134 [2024-11-27 06:16:55.444638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:26.134 [2024-11-27 06:16:55.444667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.134 [2024-11-27 06:16:55.444787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.134 [2024-11-27 06:16:55.444803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.134 [2024-11-27 06:16:55.444939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8f7f7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.134 [2024-11-27 06:16:55.444957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.134 #38 NEW cov: 11819 ft: 14929 corp: 31/720b lim: 40 exec/s: 38 rss: 70Mb L: 34/37 MS: 3 EraseBytes-ShuffleBytes-CrossOver- 00:07:26.134 [2024-11-27 06:16:55.494291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0ac8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.134 [2024-11-27 06:16:55.494319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.134 [2024-11-27 06:16:55.494451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:c8c822c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.134 [2024-11-27 06:16:55.494468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.134 [2024-11-27 06:16:55.494614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.134 [2024-11-27 06:16:55.494630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.134 #39 NEW cov: 11819 ft: 15134 corp: 32/751b lim: 40 exec/s: 39 rss: 70Mb L: 31/37 MS: 1 EraseBytes- 00:07:26.134 [2024-11-27 06:16:55.543457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f9ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.134 [2024-11-27 06:16:55.543484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.134 #40 NEW cov: 11819 ft: 15141 corp: 33/765b lim: 40 exec/s: 40 rss: 70Mb L: 14/37 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:26.134 [2024-11-27 06:16:55.594896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f7f70ac8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.134 [2024-11-27 06:16:55.594924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.134 [2024-11-27 06:16:55.595052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.134 [2024-11-27 06:16:55.595068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
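The paired *NOTICE* lines throughout these runs are SPDK tracing each fuzzed admin command and its completion: nvme_admin_qpair_print_command shows the opcode in hex in parentheses (81 is Security Send, 82 is Security Receive; run 12 below exercises Directive Send, 19), the admin queue (qid:0), the command id, and the mutated command dwords cdw10/cdw11, while the matching spdk_nvme_print_completion shows the target failing the command with INVALID OPCODE (status code type 00h, status code 01h), i.e. rejecting it cleanly rather than crashing. A quick tally of which opcodes a saved log exercised; "build.log" is again an assumed placeholder.

```sh
# Hedged sketch: count the admin opcodes the fuzzer submitted.
grep -o 'print_command: \*NOTICE\*: [A-Z ]* ([0-9a-f]*)' build.log |
  sed 's/.*\*NOTICE\*: //' | sort | uniq -c | sort -rn
```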
00:07:26.134 [2024-11-27 06:16:55.595202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.134 [2024-11-27 06:16:55.595218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.135 [2024-11-27 06:16:55.595347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8f7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.135 [2024-11-27 06:16:55.595364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.135 #41 NEW cov: 11819 ft: 15174 corp: 34/800b lim: 40 exec/s: 41 rss: 70Mb L: 35/37 MS: 1 CopyPart- 00:07:26.135 [2024-11-27 06:16:55.655033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0ac8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.135 [2024-11-27 06:16:55.655060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.135 [2024-11-27 06:16:55.655195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:c8c822c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.135 [2024-11-27 06:16:55.655212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.135 [2024-11-27 06:16:55.655342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.135 [2024-11-27 06:16:55.655358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.135 [2024-11-27 06:16:55.655491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.135 [2024-11-27 06:16:55.655506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.395 #42 NEW cov: 11819 ft: 15227 corp: 35/834b lim: 40 exec/s: 42 rss: 70Mb L: 34/37 MS: 1 ChangeBit- 00:07:26.395 [2024-11-27 06:16:55.705019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0ac8c838 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.395 [2024-11-27 06:16:55.705045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.395 [2024-11-27 06:16:55.705183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3737e2c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.395 [2024-11-27 06:16:55.705201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.395 [2024-11-27 06:16:55.705303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.395 [2024-11-27 06:16:55.705320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:07:26.395 #43 NEW cov: 11819 ft: 15234 corp: 36/865b lim: 40 exec/s: 43 rss: 70Mb L: 31/37 MS: 1 ChangeBinInt- 00:07:26.395 [2024-11-27 06:16:55.755365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:3d000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.395 [2024-11-27 06:16:55.755392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.395 [2024-11-27 06:16:55.755529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00fbfff9 cdw11:f9ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.395 [2024-11-27 06:16:55.755548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.395 [2024-11-27 06:16:55.755681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:f7f7f7ff cdw11:f7f7f7ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.395 [2024-11-27 06:16:55.755699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.395 [2024-11-27 06:16:55.755830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fffff7f7 cdw11:f7fff7f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.395 [2024-11-27 06:16:55.755847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.395 #44 NEW cov: 11819 ft: 15236 corp: 37/899b lim: 40 exec/s: 22 rss: 70Mb L: 34/37 MS: 1 ChangeBinInt- 00:07:26.395 #44 DONE cov: 11819 ft: 15236 corp: 37/899b lim: 40 exec/s: 22 rss: 70Mb 00:07:26.395 ###### Recommended dictionary. ###### 00:07:26.395 "\377\377\377\377" # Uses: 1 00:07:26.395 "\377\377\377\377\377\377\377\377" # Uses: 1 00:07:26.395 ###### End of recommended dictionary. 
###### 00:07:26.395 Done 44 runs in 2 second(s) 00:07:26.395 06:16:55 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:07:26.395 06:16:55 -- ../common.sh@72 -- # (( i++ )) 00:07:26.395 06:16:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:26.395 06:16:55 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:26.395 06:16:55 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:26.395 06:16:55 -- nvmf/run.sh@24 -- # local timen=1 00:07:26.395 06:16:55 -- nvmf/run.sh@25 -- # local core=0x1 00:07:26.395 06:16:55 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:26.395 06:16:55 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:26.395 06:16:55 -- nvmf/run.sh@29 -- # printf %02d 12 00:07:26.395 06:16:55 -- nvmf/run.sh@29 -- # port=4412 00:07:26.395 06:16:55 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:26.395 06:16:55 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:26.395 06:16:55 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:26.395 06:16:55 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:07:26.655 [2024-11-27 06:16:55.940172] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:26.655 [2024-11-27 06:16:55.940237] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid34365 ] 00:07:26.655 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.655 [2024-11-27 06:16:56.116771] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.655 [2024-11-27 06:16:56.180559] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:26.655 [2024-11-27 06:16:56.180708] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.915 [2024-11-27 06:16:56.238765] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:26.915 [2024-11-27 06:16:56.255084] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:26.915 INFO: Running with entropic power schedule (0xFF, 100). 00:07:26.915 INFO: Seed: 1956388938 00:07:26.915 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:26.915 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:26.915 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:26.915 INFO: A corpus is not provided, starting from an empty corpus 00:07:26.915 #2 INITED exec/s: 0 rss: 61Mb 00:07:26.915 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
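The run.sh trace above shows the pattern the nvmf harness repeats for every fuzzer index: it builds the TCP listen port by appending the zero-padded index to "44" (printf %02d 12 yields port=4412), rewrites the default trsvcid in fuzz_json.conf, and launches llvm_nvme_fuzz against the resulting transport ID. A minimal shell sketch of that setup, with illustrative variable names (idx, port, cfg are not taken from run.sh) and paths and flags abbreviated to those visible in the trace:

    # Sketch of the per-target setup seen in the run.sh trace above.
    idx=12
    port="44$(printf %02d "$idx")"        # -> 4412
    cfg="/tmp/fuzz_json_${idx}.conf"
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        test/fuzz/llvm/nvmf/fuzz_json.conf > "$cfg"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 \
        -F "$trid" -c "$cfg" -t 1 -D "../corpus/llvm_nvmf_${idx}" \
        -Z "$idx" -r "/var/tmp/spdk${idx}.sock"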
00:07:26.915 This may also happen if the target rejected all inputs we tried so far 00:07:26.915 [2024-11-27 06:16:56.310480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.915 [2024-11-27 06:16:56.310509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.915 [2024-11-27 06:16:56.310580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.915 [2024-11-27 06:16:56.310594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.174 NEW_FUNC[1/671]: 0x44b168 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:27.174 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:27.174 #9 NEW cov: 11590 ft: 11580 corp: 2/19b lim: 40 exec/s: 0 rss: 68Mb L: 18/18 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:27.174 [2024-11-27 06:16:56.621314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.174 [2024-11-27 06:16:56.621353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.174 [2024-11-27 06:16:56.621426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.174 [2024-11-27 06:16:56.621444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.174 #10 NEW cov: 11703 ft: 12123 corp: 3/41b lim: 40 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 CMP- DE: "\022\000\000\000"- 00:07:27.174 [2024-11-27 06:16:56.671158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.174 [2024-11-27 06:16:56.671186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.174 #16 NEW cov: 11709 ft: 13083 corp: 4/51b lim: 40 exec/s: 0 rss: 68Mb L: 10/22 MS: 1 EraseBytes- 00:07:27.434 [2024-11-27 06:16:56.711422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.711449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.434 [2024-11-27 06:16:56.711506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.711521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.434 #17 NEW cov: 11794 ft: 13399 corp: 5/73b lim: 40 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 ChangeBinInt- 00:07:27.434 [2024-11-27 06:16:56.751858] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.751886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.434 [2024-11-27 06:16:56.751941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.751955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.434 [2024-11-27 06:16:56.752009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ebeb1200 cdw11:05ebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.752022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.434 [2024-11-27 06:16:56.752077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.752090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.434 #23 NEW cov: 11794 ft: 13788 corp: 6/111b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 CopyPart- 00:07:27.434 [2024-11-27 06:16:56.791652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.791678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.434 [2024-11-27 06:16:56.791736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebeb1eeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.791750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.434 #24 NEW cov: 11794 ft: 13861 corp: 7/133b lim: 40 exec/s: 0 rss: 68Mb L: 22/38 MS: 1 ChangeByte- 00:07:27.434 [2024-11-27 06:16:56.831906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.831935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.434 [2024-11-27 06:16:56.831992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.832005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.434 [2024-11-27 06:16:56.832064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ebebebeb cdw11:eb120005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.832077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.434 #25 NEW cov: 11794 ft: 14155 
corp: 8/159b lim: 40 exec/s: 0 rss: 68Mb L: 26/38 MS: 1 CrossOver- 00:07:27.434 [2024-11-27 06:16:56.872025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.872051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.434 [2024-11-27 06:16:56.872126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.872140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.434 [2024-11-27 06:16:56.872197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ebebebeb cdw11:eb12000d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.872210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.434 #26 NEW cov: 11794 ft: 14177 corp: 9/185b lim: 40 exec/s: 0 rss: 69Mb L: 26/38 MS: 1 ChangeBit- 00:07:27.434 [2024-11-27 06:16:56.911836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.911861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.434 #27 NEW cov: 11794 ft: 14219 corp: 10/199b lim: 40 exec/s: 0 rss: 69Mb L: 14/38 MS: 1 CrossOver- 00:07:27.434 [2024-11-27 06:16:56.952503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.952529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.434 [2024-11-27 06:16:56.952585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.952603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.434 [2024-11-27 06:16:56.952662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.952675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.434 [2024-11-27 06:16:56.952732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ebebebeb cdw11:12000500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.434 [2024-11-27 06:16:56.952746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.694 #28 NEW cov: 11794 ft: 14290 corp: 11/232b lim: 40 exec/s: 0 rss: 69Mb L: 33/38 MS: 1 CrossOver- 00:07:27.694 [2024-11-27 06:16:56.992176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:27.694 [2024-11-27 06:16:56.992201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.694 #29 NEW cov: 11794 ft: 14390 corp: 12/240b lim: 40 exec/s: 0 rss: 69Mb L: 8/38 MS: 1 EraseBytes- 00:07:27.694 [2024-11-27 06:16:57.032302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebeb3aeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.694 [2024-11-27 06:16:57.032328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.694 #30 NEW cov: 11794 ft: 14398 corp: 13/254b lim: 40 exec/s: 0 rss: 69Mb L: 14/38 MS: 1 ChangeByte- 00:07:27.694 [2024-11-27 06:16:57.072912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.694 [2024-11-27 06:16:57.072938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.694 [2024-11-27 06:16:57.072994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.694 [2024-11-27 06:16:57.073008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.694 [2024-11-27 06:16:57.073066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.694 [2024-11-27 06:16:57.073079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.694 [2024-11-27 06:16:57.073136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.694 [2024-11-27 06:16:57.073149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.694 #33 NEW cov: 11794 ft: 14413 corp: 14/293b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 3 InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:27.694 [2024-11-27 06:16:57.112634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a181818 cdw11:18181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.694 [2024-11-27 06:16:57.112660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.694 [2024-11-27 06:16:57.112720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:18181818 cdw11:18181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.694 [2024-11-27 06:16:57.112734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.694 #34 NEW cov: 11794 ft: 14458 corp: 15/311b lim: 40 exec/s: 0 rss: 69Mb L: 18/39 MS: 1 InsertRepeatedBytes- 00:07:27.694 [2024-11-27 06:16:57.153114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.694 [2024-11-27 06:16:57.153139] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.694 [2024-11-27 06:16:57.153198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:0b141414 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.694 [2024-11-27 06:16:57.153211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.694 [2024-11-27 06:16:57.153281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ebeb1200 cdw11:05ebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.694 [2024-11-27 06:16:57.153298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.694 [2024-11-27 06:16:57.153354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.695 [2024-11-27 06:16:57.153368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.695 #35 NEW cov: 11794 ft: 14498 corp: 16/349b lim: 40 exec/s: 0 rss: 69Mb L: 38/39 MS: 1 ChangeBinInt- 00:07:27.695 [2024-11-27 06:16:57.192782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:12000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.695 [2024-11-27 06:16:57.192809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.695 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:27.695 #36 NEW cov: 11817 ft: 14525 corp: 17/363b lim: 40 exec/s: 0 rss: 69Mb L: 14/39 MS: 1 PersAutoDict- DE: "\022\000\000\000"- 00:07:27.954 [2024-11-27 06:16:57.232901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebefeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.954 [2024-11-27 06:16:57.232926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.954 #37 NEW cov: 11817 ft: 14556 corp: 18/373b lim: 40 exec/s: 0 rss: 69Mb L: 10/39 MS: 1 ChangeBit- 00:07:27.954 [2024-11-27 06:16:57.273131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1514db12 cdw11:0005ebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.954 [2024-11-27 06:16:57.273156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.954 [2024-11-27 06:16:57.273229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.954 [2024-11-27 06:16:57.273243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.954 #41 NEW cov: 11817 ft: 14565 corp: 19/392b lim: 40 exec/s: 41 rss: 69Mb L: 19/39 MS: 4 EraseBytes-InsertByte-ChangeBinInt-CrossOver- 00:07:27.954 [2024-11-27 06:16:57.313297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a181818 cdw11:18181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:27.954 [2024-11-27 06:16:57.313321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.954 [2024-11-27 06:16:57.313378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:18181818 cdw11:18181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.954 [2024-11-27 06:16:57.313392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.954 #42 NEW cov: 11817 ft: 14645 corp: 20/410b lim: 40 exec/s: 42 rss: 69Mb L: 18/39 MS: 1 ChangeBit- 00:07:27.954 [2024-11-27 06:16:57.353395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1514db12 cdw11:0005ebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.954 [2024-11-27 06:16:57.353419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.954 [2024-11-27 06:16:57.353477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebeb8deb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.954 [2024-11-27 06:16:57.353490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.954 #43 NEW cov: 11817 ft: 14671 corp: 21/430b lim: 40 exec/s: 43 rss: 69Mb L: 20/39 MS: 1 InsertByte- 00:07:27.954 [2024-11-27 06:16:57.393838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a181818 cdw11:18181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.954 [2024-11-27 06:16:57.393863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.954 [2024-11-27 06:16:57.393934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:18181818 cdw11:18181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.954 [2024-11-27 06:16:57.393948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.954 [2024-11-27 06:16:57.394003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:18191818 cdw11:18181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.954 [2024-11-27 06:16:57.394016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.954 [2024-11-27 06:16:57.394071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:18181818 cdw11:18181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.954 [2024-11-27 06:16:57.394084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.954 #44 NEW cov: 11817 ft: 14709 corp: 22/464b lim: 40 exec/s: 44 rss: 69Mb L: 34/39 MS: 1 CopyPart- 00:07:27.954 [2024-11-27 06:16:57.433465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:12000000 cdw11:001f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.954 [2024-11-27 06:16:57.433490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.955 #49 NEW cov: 11817 ft: 14773 corp: 23/473b lim: 40 exec/s: 49 rss: 
69Mb L: 9/39 MS: 5 ChangeByte-PersAutoDict-ChangeByte-CopyPart-CMP- DE: "\022\000\000\000"-"\000\000\000\037"- 00:07:27.955 [2024-11-27 06:16:57.473602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.955 [2024-11-27 06:16:57.473626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.214 #50 NEW cov: 11817 ft: 14779 corp: 24/481b lim: 40 exec/s: 50 rss: 70Mb L: 8/39 MS: 1 EraseBytes- 00:07:28.214 [2024-11-27 06:16:57.513668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.513692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.215 #51 NEW cov: 11817 ft: 14807 corp: 25/489b lim: 40 exec/s: 51 rss: 70Mb L: 8/39 MS: 1 EraseBytes- 00:07:28.215 [2024-11-27 06:16:57.554395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.554420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.215 [2024-11-27 06:16:57.554476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:0b141414 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.554489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.215 [2024-11-27 06:16:57.554542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ebeb1200 cdw11:05ebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.554555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.215 [2024-11-27 06:16:57.554614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.554646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.215 [2024-11-27 06:16:57.554700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:ff05eb12 cdw11:0005003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.554721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:28.215 #52 NEW cov: 11817 ft: 14912 corp: 26/529b lim: 40 exec/s: 52 rss: 70Mb L: 40/40 MS: 1 CMP- DE: "\377\005"- 00:07:28.215 [2024-11-27 06:16:57.604230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.604256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.215 [2024-11-27 06:16:57.604312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) 
qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.604325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.215 [2024-11-27 06:16:57.604379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ebebebeb cdw11:eb12000d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.604392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.215 #53 NEW cov: 11817 ft: 14990 corp: 27/555b lim: 40 exec/s: 53 rss: 70Mb L: 26/40 MS: 1 ShuffleBytes- 00:07:28.215 [2024-11-27 06:16:57.644553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.644579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.215 [2024-11-27 06:16:57.644633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebeb1eeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.644648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.215 [2024-11-27 06:16:57.644704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:eb12ebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.644717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.215 [2024-11-27 06:16:57.644769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ebebeb1e cdw11:ebeb1200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.644782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.215 #54 NEW cov: 11817 ft: 15000 corp: 28/594b lim: 40 exec/s: 54 rss: 70Mb L: 39/40 MS: 1 CopyPart- 00:07:28.215 [2024-11-27 06:16:57.684313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a181818 cdw11:18181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.684338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.215 [2024-11-27 06:16:57.684395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:18181818 cdw11:18181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.684409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.215 #55 NEW cov: 11817 ft: 15009 corp: 29/613b lim: 40 exec/s: 55 rss: 70Mb L: 19/40 MS: 1 InsertByte- 00:07:28.215 [2024-11-27 06:16:57.724432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.724457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:07:28.215 [2024-11-27 06:16:57.724514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebeb12 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.215 [2024-11-27 06:16:57.724527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.215 #56 NEW cov: 11817 ft: 15045 corp: 30/633b lim: 40 exec/s: 56 rss: 70Mb L: 20/40 MS: 1 EraseBytes- 00:07:28.475 [2024-11-27 06:16:57.764350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:57.764375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.475 #57 NEW cov: 11817 ft: 15103 corp: 31/643b lim: 40 exec/s: 57 rss: 70Mb L: 10/40 MS: 1 CMP- DE: "\000\000\000\001"- 00:07:28.475 [2024-11-27 06:16:57.804458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebeb46 cdw11:ebebebef SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:57.804482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.475 #58 NEW cov: 11817 ft: 15200 corp: 32/654b lim: 40 exec/s: 58 rss: 70Mb L: 11/40 MS: 1 InsertByte- 00:07:28.475 [2024-11-27 06:16:57.845140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebe3eb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:57.845165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.475 [2024-11-27 06:16:57.845237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebeb1eeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:57.845251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.475 [2024-11-27 06:16:57.845306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:eb12ebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:57.845319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.475 [2024-11-27 06:16:57.845374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ebebeb1e cdw11:ebeb1200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:57.845387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.475 #59 NEW cov: 11817 ft: 15216 corp: 33/693b lim: 40 exec/s: 59 rss: 70Mb L: 39/40 MS: 1 ChangeBit- 00:07:28.475 [2024-11-27 06:16:57.884892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebeb46 cdw11:ebebebd4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:57.884916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.475 [2024-11-27 06:16:57.884990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:d4d4d4d4 cdw11:d4d4efeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:57.885004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.475 #60 NEW cov: 11817 ft: 15243 corp: 34/711b lim: 40 exec/s: 60 rss: 70Mb L: 18/40 MS: 1 InsertRepeatedBytes- 00:07:28.475 [2024-11-27 06:16:57.924824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebeb46 cdw11:ebefebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:57.924851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.475 #61 NEW cov: 11817 ft: 15258 corp: 35/720b lim: 40 exec/s: 61 rss: 70Mb L: 9/40 MS: 1 EraseBytes- 00:07:28.475 [2024-11-27 06:16:57.965112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1514db12 cdw11:0003ebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:57.965137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.475 [2024-11-27 06:16:57.965208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:57.965222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.475 #62 NEW cov: 11817 ft: 15298 corp: 36/739b lim: 40 exec/s: 62 rss: 70Mb L: 19/40 MS: 1 ChangeBinInt- 00:07:28.475 [2024-11-27 06:16:58.005558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:58.005582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.475 [2024-11-27 06:16:58.005644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebebff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:58.005658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.475 [2024-11-27 06:16:58.005714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:58.005727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.475 [2024-11-27 06:16:58.005782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ebebebeb cdw11:eb12000d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.475 [2024-11-27 06:16:58.005795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.735 #63 NEW cov: 11817 ft: 15309 corp: 37/773b lim: 40 exec/s: 63 rss: 70Mb L: 34/40 MS: 1 InsertRepeatedBytes- 00:07:28.735 [2024-11-27 06:16:58.045205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:2cebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:28.735 [2024-11-27 06:16:58.045230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.735 #64 NEW cov: 11817 ft: 15340 corp: 38/784b lim: 40 exec/s: 64 rss: 70Mb L: 11/40 MS: 1 InsertByte- 00:07:28.735 [2024-11-27 06:16:58.085300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebeb2d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.735 [2024-11-27 06:16:58.085324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.735 #65 NEW cov: 11817 ft: 15390 corp: 39/793b lim: 40 exec/s: 65 rss: 70Mb L: 9/40 MS: 1 InsertByte- 00:07:28.735 [2024-11-27 06:16:58.125384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:abebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.735 [2024-11-27 06:16:58.125408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.735 #66 NEW cov: 11817 ft: 15397 corp: 40/803b lim: 40 exec/s: 66 rss: 70Mb L: 10/40 MS: 1 ChangeBit- 00:07:28.735 [2024-11-27 06:16:58.165969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.735 [2024-11-27 06:16:58.165996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.735 [2024-11-27 06:16:58.166053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.735 [2024-11-27 06:16:58.166066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.735 [2024-11-27 06:16:58.166121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ebeb1200 cdw11:05ebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.735 [2024-11-27 06:16:58.166134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.735 [2024-11-27 06:16:58.166189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ebebebeb cdw11:ebebeb00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.735 [2024-11-27 06:16:58.166202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.735 #67 NEW cov: 11817 ft: 15414 corp: 41/841b lim: 40 exec/s: 67 rss: 70Mb L: 38/40 MS: 1 PersAutoDict- DE: "\000\000\000\037"- 00:07:28.735 [2024-11-27 06:16:58.205782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:db120003 cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.735 [2024-11-27 06:16:58.205809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.735 [2024-11-27 06:16:58.205865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebeb13eb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.735 [2024-11-27 06:16:58.205878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.735 #68 NEW cov: 11817 ft: 15435 corp: 42/858b lim: 40 exec/s: 68 rss: 70Mb L: 17/40 MS: 1 EraseBytes- 00:07:28.735 [2024-11-27 06:16:58.245824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ebebebeb cdw11:2deb0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.735 [2024-11-27 06:16:58.245850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.995 #70 NEW cov: 11817 ft: 15446 corp: 43/873b lim: 40 exec/s: 70 rss: 70Mb L: 15/40 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:28.995 [2024-11-27 06:16:58.286327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a181818 cdw11:21181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.995 [2024-11-27 06:16:58.286352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.995 [2024-11-27 06:16:58.286425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:18181818 cdw11:18181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.995 [2024-11-27 06:16:58.286439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.995 [2024-11-27 06:16:58.286495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:18191818 cdw11:18181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.995 [2024-11-27 06:16:58.286508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.995 [2024-11-27 06:16:58.286564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:18181818 cdw11:18181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.995 [2024-11-27 06:16:58.286578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.995 #76 NEW cov: 11817 ft: 15459 corp: 44/907b lim: 40 exec/s: 38 rss: 70Mb L: 34/40 MS: 1 ChangeBinInt- 00:07:28.995 #76 DONE cov: 11817 ft: 15459 corp: 44/907b lim: 40 exec/s: 38 rss: 70Mb 00:07:28.995 ###### Recommended dictionary. ###### 00:07:28.995 "\022\000\000\000" # Uses: 3 00:07:28.995 "\000\000\000\037" # Uses: 2 00:07:28.995 "\377\005" # Uses: 0 00:07:28.995 "\000\000\000\001" # Uses: 0 00:07:28.995 ###### End of recommended dictionary. 
###### 00:07:28.995 Done 76 runs in 2 second(s) 00:07:28.995 06:16:58 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:07:28.995 06:16:58 -- ../common.sh@72 -- # (( i++ )) 00:07:28.995 06:16:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:28.995 06:16:58 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:07:28.995 06:16:58 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:07:28.995 06:16:58 -- nvmf/run.sh@24 -- # local timen=1 00:07:28.995 06:16:58 -- nvmf/run.sh@25 -- # local core=0x1 00:07:28.995 06:16:58 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:28.995 06:16:58 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:07:28.995 06:16:58 -- nvmf/run.sh@29 -- # printf %02d 13 00:07:28.995 06:16:58 -- nvmf/run.sh@29 -- # port=4413 00:07:28.995 06:16:58 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:28.995 06:16:58 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:07:28.995 06:16:58 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:28.995 06:16:58 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:07:28.995 [2024-11-27 06:16:58.468051] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:28.995 [2024-11-27 06:16:58.468123] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid34766 ] 00:07:28.995 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.255 [2024-11-27 06:16:58.648499] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.255 [2024-11-27 06:16:58.712910] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:29.255 [2024-11-27 06:16:58.713051] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.255 [2024-11-27 06:16:58.771114] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:29.255 [2024-11-27 06:16:58.787509] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:07:29.515 INFO: Running with entropic power schedule (0xFF, 100). 00:07:29.515 INFO: Seed: 191409649 00:07:29.515 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:29.515 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:29.515 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:29.515 INFO: A corpus is not provided, starting from an empty corpus 00:07:29.515 #2 INITED exec/s: 0 rss: 60Mb 00:07:29.515 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
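Each completed run also ends with a "Recommended dictionary" block like the two summaries above: byte patterns the fuzzer found productive, printed as C-style octal escapes (so "\022\000\000\000" is the byte 0x12 followed by three zero bytes), each with a count of how often it was used. libFuzzer can be seeded with such tokens through an AFL-style dictionary file passed as -dict=FILE; a hypothetical file built from the run-12 summary could look like the sketch below (the names on the left are arbitrary labels, and whether the SPDK wrapper forwards -dict through to libFuzzer is not shown in this log):

    # nvmf_12.dict, hypothetical dictionary reconstructed from the summary above
    kw1="\x12\x00\x00\x00"   # "\022\000\000\000"
    kw2="\x00\x00\x00\x1f"   # "\000\000\000\037"
    kw3="\xff\x05"           # "\377\005"
    kw4="\x00\x00\x00\x01"   # "\000\000\000\001"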
00:07:29.515 This may also happen if the target rejected all inputs we tried so far 00:07:29.515 [2024-11-27 06:16:58.836328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.515 [2024-11-27 06:16:58.836356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.515 [2024-11-27 06:16:58.836425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.515 [2024-11-27 06:16:58.836439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.515 [2024-11-27 06:16:58.836496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.515 [2024-11-27 06:16:58.836509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.515 [2024-11-27 06:16:58.836562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.515 [2024-11-27 06:16:58.836574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.788 NEW_FUNC[1/668]: 0x44cd38 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:07:29.788 NEW_FUNC[2/668]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:29.788 #15 NEW cov: 11565 ft: 11566 corp: 2/38b lim: 40 exec/s: 0 rss: 68Mb L: 37/37 MS: 3 CrossOver-EraseBytes-InsertRepeatedBytes- 00:07:29.788 [2024-11-27 06:16:59.137047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.137078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.788 [2024-11-27 06:16:59.137146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585850 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.137159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.788 [2024-11-27 06:16:59.137213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.137227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.788 [2024-11-27 06:16:59.137279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.137292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:07:29.788 NEW_FUNC[1/2]: 0x1546c88 in nvme_ctrlr_get_ready_timeout /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:1211 00:07:29.788 NEW_FUNC[2/2]: 0x19999d8 in sock_group_impl_poll_count /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/sock/sock.c:710 00:07:29.788 #21 NEW cov: 11691 ft: 12008 corp: 3/75b lim: 40 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 ChangeBit- 00:07:29.788 [2024-11-27 06:16:59.187071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:5858582c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.187097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.788 [2024-11-27 06:16:59.187154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.187168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.788 [2024-11-27 06:16:59.187220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.187233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.788 [2024-11-27 06:16:59.187288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.187303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.788 #22 NEW cov: 11697 ft: 12212 corp: 4/112b lim: 40 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 ChangeByte- 00:07:29.788 [2024-11-27 06:16:59.227222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.227249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.788 [2024-11-27 06:16:59.227300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.227313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.788 [2024-11-27 06:16:59.227366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.227379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.788 [2024-11-27 06:16:59.227429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.227441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:07:29.788 #23 NEW cov: 11782 ft: 12590 corp: 5/149b lim: 40 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 CopyPart- 00:07:29.788 [2024-11-27 06:16:59.267317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.267343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.788 [2024-11-27 06:16:59.267398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585850 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.788 [2024-11-27 06:16:59.267411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.788 [2024-11-27 06:16:59.267461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.789 [2024-11-27 06:16:59.267474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.789 [2024-11-27 06:16:59.267524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:585858a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.789 [2024-11-27 06:16:59.267537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.789 #24 NEW cov: 11782 ft: 12672 corp: 6/187b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 InsertByte- 00:07:29.789 [2024-11-27 06:16:59.307468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.789 [2024-11-27 06:16:59.307493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.789 [2024-11-27 06:16:59.307547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.789 [2024-11-27 06:16:59.307560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.789 [2024-11-27 06:16:59.307615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.789 [2024-11-27 06:16:59.307631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.789 [2024-11-27 06:16:59.307684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.789 [2024-11-27 06:16:59.307697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.048 #25 NEW cov: 11782 ft: 12810 corp: 7/224b lim: 40 exec/s: 0 rss: 68Mb L: 37/38 MS: 1 CrossOver- 00:07:30.048 [2024-11-27 06:16:59.347512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58595858 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:07:30.048 [2024-11-27 06:16:59.347536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.048 [2024-11-27 06:16:59.347590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585850 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.048 [2024-11-27 06:16:59.347607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.048 [2024-11-27 06:16:59.347658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.048 [2024-11-27 06:16:59.347671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.048 [2024-11-27 06:16:59.347722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:585858a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.048 [2024-11-27 06:16:59.347735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.048 #26 NEW cov: 11782 ft: 12894 corp: 8/262b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 ChangeBit- 00:07:30.048 [2024-11-27 06:16:59.387660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.048 [2024-11-27 06:16:59.387685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.048 [2024-11-27 06:16:59.387737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:582e5858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.048 [2024-11-27 06:16:59.387751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.048 [2024-11-27 06:16:59.387803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.048 [2024-11-27 06:16:59.387832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.048 [2024-11-27 06:16:59.387886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.048 [2024-11-27 06:16:59.387899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.048 #27 NEW cov: 11782 ft: 12929 corp: 9/300b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 InsertByte- 00:07:30.048 [2024-11-27 06:16:59.427816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:585858f5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.048 [2024-11-27 06:16:59.427841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.048 [2024-11-27 06:16:59.427910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 
cdw10:58585850 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.427924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.049 [2024-11-27 06:16:59.427976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.427989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.049 [2024-11-27 06:16:59.428040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:585858a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.428052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.049 #28 NEW cov: 11782 ft: 12951 corp: 10/338b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ChangeByte- 00:07:30.049 [2024-11-27 06:16:59.467874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.467899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.049 [2024-11-27 06:16:59.467949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585850 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.467962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.049 [2024-11-27 06:16:59.468014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.468026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.049 [2024-11-27 06:16:59.468079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:585858a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.468091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.049 #29 NEW cov: 11782 ft: 12996 corp: 11/376b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ShuffleBytes- 00:07:30.049 [2024-11-27 06:16:59.508009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:5a5858f5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.508034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.049 [2024-11-27 06:16:59.508087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585850 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.508100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.049 [2024-11-27 06:16:59.508153] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.508166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.049 [2024-11-27 06:16:59.508219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:585858a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.508231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.049 #30 NEW cov: 11782 ft: 13039 corp: 12/414b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ChangeBit- 00:07:30.049 [2024-11-27 06:16:59.548140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:5a5858f5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.548165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.049 [2024-11-27 06:16:59.548218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585850 cdw11:5858586a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.548231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.049 [2024-11-27 06:16:59.548284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.548296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.049 [2024-11-27 06:16:59.548349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.049 [2024-11-27 06:16:59.548361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.049 #31 NEW cov: 11782 ft: 13080 corp: 13/453b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 InsertByte- 00:07:30.309 [2024-11-27 06:16:59.588288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58595858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.588312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.588364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585050 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.588388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.588439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.588451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:07:30.309 [2024-11-27 06:16:59.588504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:585858a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.588517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.309 #32 NEW cov: 11782 ft: 13114 corp: 14/491b lim: 40 exec/s: 0 rss: 69Mb L: 38/39 MS: 1 ChangeBit- 00:07:30.309 [2024-11-27 06:16:59.628264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2a585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.628289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.628343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.628356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.628409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.628421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.309 #37 NEW cov: 11782 ft: 13595 corp: 15/517b lim: 40 exec/s: 0 rss: 69Mb L: 26/39 MS: 5 ChangeBit-ChangeBit-ChangeByte-ShuffleBytes-CrossOver- 00:07:30.309 [2024-11-27 06:16:59.668454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.668478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.668531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.668544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.668602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.668631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.668684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.668696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.309 #38 NEW cov: 11782 ft: 13634 corp: 16/554b lim: 40 exec/s: 0 rss: 69Mb L: 37/39 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:30.309 [2024-11-27 06:16:59.708331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) 
qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.708356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.708408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.708421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.309 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:30.309 #39 NEW cov: 11805 ft: 13930 corp: 17/573b lim: 40 exec/s: 0 rss: 69Mb L: 19/39 MS: 1 EraseBytes- 00:07:30.309 [2024-11-27 06:16:59.748484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.748509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.748565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.748578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.309 #40 NEW cov: 11805 ft: 13937 corp: 18/594b lim: 40 exec/s: 0 rss: 69Mb L: 21/39 MS: 1 EraseBytes- 00:07:30.309 [2024-11-27 06:16:59.788808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.788833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.788902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:582e5858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.788916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.788973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.788986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.789035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.789047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.309 #41 NEW cov: 11805 ft: 13957 corp: 19/632b lim: 40 exec/s: 0 rss: 69Mb L: 38/39 MS: 1 ShuffleBytes- 00:07:30.309 [2024-11-27 06:16:59.829005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:5858582c SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.829031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.829084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.829097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.829150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.829163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.309 [2024-11-27 06:16:59.829216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.309 [2024-11-27 06:16:59.829228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.569 #42 NEW cov: 11805 ft: 13999 corp: 20/670b lim: 40 exec/s: 42 rss: 69Mb L: 38/39 MS: 1 InsertByte- 00:07:30.569 [2024-11-27 06:16:59.869069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.869095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.569 [2024-11-27 06:16:59.869148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.869162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.569 [2024-11-27 06:16:59.869213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.869225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.569 [2024-11-27 06:16:59.869275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.869287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.569 #43 NEW cov: 11805 ft: 14024 corp: 21/707b lim: 40 exec/s: 43 rss: 69Mb L: 37/39 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:30.569 [2024-11-27 06:16:59.909164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.909192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.569 [2024-11-27 06:16:59.909246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585850 cdw11:58235858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.909259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.569 [2024-11-27 06:16:59.909312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.909325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.569 [2024-11-27 06:16:59.909377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.909389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.569 #44 NEW cov: 11805 ft: 14108 corp: 22/746b lim: 40 exec/s: 44 rss: 69Mb L: 39/39 MS: 1 InsertByte- 00:07:30.569 [2024-11-27 06:16:59.939339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.939364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.569 [2024-11-27 06:16:59.939431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:582e5858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.939445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.569 [2024-11-27 06:16:59.939497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585886 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.939510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.569 [2024-11-27 06:16:59.939564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.939577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.569 #45 NEW cov: 11805 ft: 14171 corp: 23/785b lim: 40 exec/s: 45 rss: 69Mb L: 39/39 MS: 1 InsertByte- 00:07:30.569 [2024-11-27 06:16:59.979361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.979387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.569 [2024-11-27 06:16:59.979456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585850 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.979470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.569 [2024-11-27 
06:16:59.979522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.979535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.569 [2024-11-27 06:16:59.979588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:585858a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.569 [2024-11-27 06:16:59.979615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.569 #46 NEW cov: 11805 ft: 14201 corp: 24/823b lim: 40 exec/s: 46 rss: 69Mb L: 38/39 MS: 1 ShuffleBytes- 00:07:30.569 [2024-11-27 06:17:00.009454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.570 [2024-11-27 06:17:00.009480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.570 [2024-11-27 06:17:00.009532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585058 cdw11:23585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.570 [2024-11-27 06:17:00.009546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.570 [2024-11-27 06:17:00.009603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.570 [2024-11-27 06:17:00.009616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.570 [2024-11-27 06:17:00.009668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:585858a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.570 [2024-11-27 06:17:00.009680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.570 #47 NEW cov: 11805 ft: 14209 corp: 25/861b lim: 40 exec/s: 47 rss: 69Mb L: 38/39 MS: 1 EraseBytes- 00:07:30.570 [2024-11-27 06:17:00.049748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.570 [2024-11-27 06:17:00.049776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.570 [2024-11-27 06:17:00.049830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58582e58 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.570 [2024-11-27 06:17:00.049844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.570 [2024-11-27 06:17:00.049898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.570 [2024-11-27 06:17:00.049910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.570 [2024-11-27 06:17:00.049964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:86585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.570 [2024-11-27 06:17:00.049976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.570 [2024-11-27 06:17:00.050029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:58585858 cdw11:5858580a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.570 [2024-11-27 06:17:00.050042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:30.570 #48 NEW cov: 11805 ft: 14264 corp: 26/901b lim: 40 exec/s: 48 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:07:30.570 [2024-11-27 06:17:00.089743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:26585958 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.570 [2024-11-27 06:17:00.089771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.570 [2024-11-27 06:17:00.089825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585850 cdw11:50585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.570 [2024-11-27 06:17:00.089841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.570 [2024-11-27 06:17:00.089896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.570 [2024-11-27 06:17:00.089909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.570 [2024-11-27 06:17:00.089962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.570 [2024-11-27 06:17:00.089974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.830 #49 NEW cov: 11805 ft: 14267 corp: 27/940b lim: 40 exec/s: 49 rss: 70Mb L: 39/40 MS: 1 InsertByte- 00:07:30.830 [2024-11-27 06:17:00.129847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.129874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.129929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585850 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.129942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.129996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.130009] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.130059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:585858a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.130072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.830 #50 NEW cov: 11805 ft: 14270 corp: 28/978b lim: 40 exec/s: 50 rss: 70Mb L: 38/40 MS: 1 CopyPart- 00:07:30.830 [2024-11-27 06:17:00.169957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.169982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.170050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.170064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.170117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.170129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.170181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.170194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.830 #51 NEW cov: 11805 ft: 14278 corp: 29/1015b lim: 40 exec/s: 51 rss: 70Mb L: 37/40 MS: 1 CopyPart- 00:07:30.830 [2024-11-27 06:17:00.210077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:585858ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.210106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.210160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2c585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.210173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.210227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.210239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.210294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:07:30.830 [2024-11-27 06:17:00.210306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.240149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:585858ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.240174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.240244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2c585848 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.240258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.240311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.240324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.240378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.240391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.830 #53 NEW cov: 11805 ft: 14292 corp: 30/1053b lim: 40 exec/s: 53 rss: 70Mb L: 38/40 MS: 2 InsertByte-ChangeBit- 00:07:30.830 [2024-11-27 06:17:00.279900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000004 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.279926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.830 #54 NEW cov: 11805 ft: 14658 corp: 31/1062b lim: 40 exec/s: 54 rss: 70Mb L: 9/40 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\004"- 00:07:30.830 [2024-11-27 06:17:00.320131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.320157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.320213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:5800922d cdw11:911511cc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.320227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.830 #55 NEW cov: 11805 ft: 14744 corp: 32/1081b lim: 40 exec/s: 55 rss: 70Mb L: 19/40 MS: 1 CMP- DE: "\000\222-\221\025\021\314:"- 00:07:30.830 [2024-11-27 06:17:00.360497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:26585958 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.360523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.360578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585950 cdw11:50585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.360591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.360651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.360664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.830 [2024-11-27 06:17:00.360719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.830 [2024-11-27 06:17:00.360731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.090 #56 NEW cov: 11805 ft: 14758 corp: 33/1120b lim: 40 exec/s: 56 rss: 70Mb L: 39/40 MS: 1 ChangeBit- 00:07:31.090 [2024-11-27 06:17:00.400620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585801 cdw11:00020058 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.400645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.090 [2024-11-27 06:17:00.400700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585850 cdw11:58235858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.400714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.090 [2024-11-27 06:17:00.400764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.400777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.090 [2024-11-27 06:17:00.400830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.400843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.090 #57 NEW cov: 11805 ft: 14775 corp: 34/1159b lim: 40 exec/s: 57 rss: 70Mb L: 39/40 MS: 1 CMP- DE: "\001\000\002\000"- 00:07:31.090 [2024-11-27 06:17:00.440727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:217f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.440751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.090 [2024-11-27 06:17:00.440821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.440835] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.090 [2024-11-27 06:17:00.440888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.440901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.090 [2024-11-27 06:17:00.440959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.440972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.090 #59 NEW cov: 11805 ft: 14809 corp: 35/1197b lim: 40 exec/s: 59 rss: 70Mb L: 38/40 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:31.090 [2024-11-27 06:17:00.470807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585058 cdw11:5858582c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.470831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.090 [2024-11-27 06:17:00.470887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.470900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.090 [2024-11-27 06:17:00.470953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.470967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.090 [2024-11-27 06:17:00.471020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.471033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.090 #60 NEW cov: 11805 ft: 14875 corp: 36/1235b lim: 40 exec/s: 60 rss: 70Mb L: 38/40 MS: 1 ChangeBit- 00:07:31.090 [2024-11-27 06:17:00.510967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.510992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.090 [2024-11-27 06:17:00.511048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2c000000 cdw11:00005858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.511061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.090 [2024-11-27 06:17:00.511116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 
cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.511129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.090 [2024-11-27 06:17:00.511183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.511196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.090 #61 NEW cov: 11805 ft: 14883 corp: 37/1273b lim: 40 exec/s: 61 rss: 70Mb L: 38/40 MS: 1 InsertByte- 00:07:31.090 [2024-11-27 06:17:00.551224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.090 [2024-11-27 06:17:00.551249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.090 [2024-11-27 06:17:00.551304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:582e5858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.091 [2024-11-27 06:17:00.551320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.091 [2024-11-27 06:17:00.551372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.091 [2024-11-27 06:17:00.551385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.091 [2024-11-27 06:17:00.551438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58582e58 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.091 [2024-11-27 06:17:00.551451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.091 [2024-11-27 06:17:00.551505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:58585858 cdw11:5858580a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.091 [2024-11-27 06:17:00.551518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:31.091 #62 NEW cov: 11805 ft: 14897 corp: 38/1313b lim: 40 exec/s: 62 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:07:31.091 [2024-11-27 06:17:00.590936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.091 [2024-11-27 06:17:00.590962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.091 [2024-11-27 06:17:00.591015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:585a5858 cdw11:f5585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.091 [2024-11-27 06:17:00.591029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.091 #63 NEW cov: 11805 ft: 14947 corp: 39/1336b lim: 40 exec/s: 63 rss: 70Mb L: 23/40 
MS: 1 CrossOver- 00:07:31.351 [2024-11-27 06:17:00.631092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585886 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.631117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.351 [2024-11-27 06:17:00.631170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.631184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.351 #64 NEW cov: 11805 ft: 14952 corp: 40/1359b lim: 40 exec/s: 64 rss: 70Mb L: 23/40 MS: 1 EraseBytes- 00:07:31.351 [2024-11-27 06:17:00.671150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.671175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.351 [2024-11-27 06:17:00.671230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.671243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.351 #65 NEW cov: 11805 ft: 14961 corp: 41/1380b lim: 40 exec/s: 65 rss: 70Mb L: 21/40 MS: 1 EraseBytes- 00:07:31.351 [2024-11-27 06:17:00.711259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.711284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.351 [2024-11-27 06:17:00.711339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58a05858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.711355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.351 #66 NEW cov: 11805 ft: 15018 corp: 42/1399b lim: 40 exec/s: 66 rss: 70Mb L: 19/40 MS: 1 ChangeBinInt- 00:07:31.351 [2024-11-27 06:17:00.751619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2e585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.751644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.351 [2024-11-27 06:17:00.751699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.751712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.351 [2024-11-27 06:17:00.751764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.751777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.351 [2024-11-27 06:17:00.751831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.751843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.351 #67 NEW cov: 11805 ft: 15027 corp: 43/1437b lim: 40 exec/s: 67 rss: 70Mb L: 38/40 MS: 1 InsertByte- 00:07:31.351 [2024-11-27 06:17:00.791737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.791761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.351 [2024-11-27 06:17:00.791815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:58585850 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.791828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.351 [2024-11-27 06:17:00.791881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.791893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.351 [2024-11-27 06:17:00.791946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:585858a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.791958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.351 #68 NEW cov: 11805 ft: 15036 corp: 44/1475b lim: 40 exec/s: 68 rss: 70Mb L: 38/40 MS: 1 ChangeBinInt- 00:07:31.351 [2024-11-27 06:17:00.831618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58585886 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.831642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.351 [2024-11-27 06:17:00.831697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:5c585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.351 [2024-11-27 06:17:00.831711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.351 #69 NEW cov: 11805 ft: 15057 corp: 45/1498b lim: 40 exec/s: 34 rss: 70Mb L: 23/40 MS: 1 ChangeBit- 00:07:31.351 #69 DONE cov: 11805 ft: 15057 corp: 45/1498b lim: 40 exec/s: 34 rss: 70Mb 00:07:31.351 ###### Recommended dictionary. 
######
00:07:31.351 "\000\000\000\000\000\000\000\000" # Uses: 1
00:07:31.351 "\001\000\000\000\000\000\000\004" # Uses: 0
00:07:31.351 "\000\222-\221\025\021\314:" # Uses: 0
00:07:31.351 "\001\000\002\000" # Uses: 0
00:07:31.351 ###### End of recommended dictionary. ######
00:07:31.351 Done 69 runs in 2 second(s)
00:07:31.611 06:17:00 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf
06:17:00 -- ../common.sh@72 -- # (( i++ ))
06:17:00 -- ../common.sh@72 -- # (( i < fuzz_num ))
06:17:00 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1
06:17:00 -- nvmf/run.sh@23 -- # local fuzzer_type=14
06:17:00 -- nvmf/run.sh@24 -- # local timen=1
06:17:00 -- nvmf/run.sh@25 -- # local core=0x1
06:17:00 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14
06:17:00 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf
06:17:00 -- nvmf/run.sh@29 -- # printf %02d 14
06:17:00 -- nvmf/run.sh@29 -- # port=4414
06:17:00 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14
06:17:00 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414'
06:17:00 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
06:17:00 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock
[2024-11-27 06:17:01.017193] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
[2024-11-27 06:17:01.017263] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid35196 ]
00:07:31.871 EAL: No free 2048 kB hugepages reported on node 1
00:07:31.871 [2024-11-27 06:17:01.198348] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:31.871 [2024-11-27 06:17:01.261609] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:31.871 [2024-11-27 06:17:01.261750] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:31.871 [2024-11-27 06:17:01.319666] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:31.871 [2024-11-27 06:17:01.336028] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 ***
00:07:31.871 INFO: Running with entropic power schedule (0xFF, 100).
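Before the run-14 startup banner continues below, a note on the recommended dictionary printed at the end of run 13: libFuzzer lists the byte sequences that produced new coverage, one quoted token per line with a use count, printed with C-style octal escapes. Entries like these can seed later runs via libFuzzer's -dict= option. A hedged sketch follows; the file path and token names are invented here, and the values are rewritten in the \xNN hex form that AFL/libFuzzer dictionary files use:

# Save two of the recommended entries in dictionary syntax.
cat > /tmp/nvmf_13.dict <<'EOF'
zeroes8="\x00\x00\x00\x00\x00\x00\x00\x00"
feat4="\x01\x00\x02\x00"
EOF
# A plain libFuzzer target would then consume it as: ./fuzzer -dict=/tmp/nvmf_13.dict
# This log does not show whether the llvm_nvme_fuzz wrapper forwards extra
# libFuzzer flags, so treat this as a generic libFuzzer illustration.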
00:07:31.871 INFO: Seed: 2742409977
00:07:31.871 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:31.871 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:31.871 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14
00:07:31.871 INFO: A corpus is not provided, starting from an empty corpus
00:07:31.871 #2 INITED exec/s: 0 rss: 61Mb
00:07:31.871 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:31.871 This may also happen if the target rejected all inputs we tried so far
00:07:32.390 NEW_FUNC[1/658]: 0x44e908 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392
00:07:32.390 NEW_FUNC[2/658]: 0x46fd38 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340
00:07:32.390 #12 NEW cov: 11461 ft: 11462 corp: 2/8b lim: 35 exec/s: 0 rss: 68Mb L: 7/7 MS: 5 CopyPart-CMP-ChangeBit-ChangeBit-CopyPart- DE: "*\000"-
00:07:32.390 #13 NEW cov: 11581 ft: 12035 corp: 3/16b lim: 35 exec/s: 0 rss: 68Mb L: 8/8 MS: 1 InsertByte-
00:07:32.390 #14 NEW cov: 11587 ft: 12334 corp: 4/24b lim: 35 exec/s: 0 rss: 68Mb L: 8/8 MS: 1 CrossOver-
00:07:32.390 [2024-11-27 06:17:01.792499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.390 [2024-11-27 06:17:01.792536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.390 [2024-11-27 06:17:01.792592] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.390 [2024-11-27 06:17:01.792613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:32.390 NEW_FUNC[1/15]: 0x169c068 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263
00:07:32.390 NEW_FUNC[2/15]: 0x169c2a8 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202
00:07:32.390 #21 NEW cov: 11809 ft: 13298 corp: 5/48b lim: 35 exec/s: 0 rss: 68Mb L: 24/24 MS: 2 CrossOver-InsertRepeatedBytes-
00:07:32.390 [2024-11-27 06:17:01.832613] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.390 [2024-11-27 06:17:01.832640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.390 [2024-11-27 06:17:01.832696] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.390 [2024-11-27 06:17:01.832709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:32.390 #22 NEW cov: 11809 ft: 13366 corp: 6/72b lim: 35 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 CMP- DE: "\001\035"-
00:07:32.390 #23 NEW cov: 11809 ft: 13412 corp: 7/79b lim: 35 exec/s: 0 rss: 68Mb L: 7/24 MS: 1 CopyPart-
00:07:32.650 #24 NEW cov: 11809 ft: 13503 corp: 8/89b lim: 35 exec/s: 0 rss: 69Mb L: 10/24 MS: 1 CMP- DE: "\002\000"-
00:07:32.650 #25 NEW cov: 11809 ft: 13560 corp: 9/98b lim: 35 exec/s: 0
rss: 69Mb L: 9/24 MS: 1 CopyPart- 00:07:32.650 [2024-11-27 06:17:01.992884] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.650 [2024-11-27 06:17:01.992913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.650 [2024-11-27 06:17:01.992986] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.650 [2024-11-27 06:17:01.993003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.650 #26 NEW cov: 11816 ft: 13767 corp: 10/117b lim: 35 exec/s: 0 rss: 69Mb L: 19/24 MS: 1 InsertRepeatedBytes- 00:07:32.650 #27 NEW cov: 11816 ft: 13822 corp: 11/125b lim: 35 exec/s: 0 rss: 69Mb L: 8/24 MS: 1 ChangeBit- 00:07:32.650 [2024-11-27 06:17:02.073108] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.650 [2024-11-27 06:17:02.073135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.650 [2024-11-27 06:17:02.073195] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.650 [2024-11-27 06:17:02.073210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.650 #28 NEW cov: 11816 ft: 13834 corp: 12/144b lim: 35 exec/s: 0 rss: 69Mb L: 19/24 MS: 1 PersAutoDict- DE: "\001\035"- 00:07:32.651 [2024-11-27 06:17:02.113231] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.651 [2024-11-27 06:17:02.113258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.651 [2024-11-27 06:17:02.113321] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.651 [2024-11-27 06:17:02.113336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.651 #29 NEW cov: 11816 ft: 13904 corp: 13/163b lim: 35 exec/s: 0 rss: 69Mb L: 19/24 MS: 1 ShuffleBytes- 00:07:32.651 [2024-11-27 06:17:02.153334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.651 [2024-11-27 06:17:02.153361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.651 [2024-11-27 06:17:02.153420] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.651 [2024-11-27 06:17:02.153436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.651 #30 NEW cov: 11816 ft: 13939 corp: 14/182b lim: 35 exec/s: 0 rss: 69Mb L: 19/24 MS: 1 ShuffleBytes- 00:07:32.911 [2024-11-27 06:17:02.193453] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.911 [2024-11-27 06:17:02.193481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.911 [2024-11-27 06:17:02.193539] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000006d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.911 [2024-11-27 06:17:02.193555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.911 #31 NEW cov: 11816 ft: 13968 corp: 15/201b lim: 35 exec/s: 0 rss: 69Mb L: 19/24 MS: 1 ChangeBinInt- 00:07:32.911 NEW_FUNC[1/2]: 0x4691c8 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:07:32.911 NEW_FUNC[2/2]: 0x112c368 in nvmf_ctrlr_set_features_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1489 00:07:32.911 #32 NEW cov: 11873 ft: 14050 corp: 16/211b lim: 35 exec/s: 0 rss: 69Mb L: 10/24 MS: 1 CopyPart- 00:07:32.911 [2024-11-27 06:17:02.273694] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.911 [2024-11-27 06:17:02.273721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.911 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:32.911 #33 NEW cov: 11896 ft: 14251 corp: 17/229b lim: 35 exec/s: 0 rss: 69Mb L: 18/24 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:32.911 [2024-11-27 06:17:02.323871] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.911 [2024-11-27 06:17:02.323897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.911 #34 NEW cov: 11896 ft: 14300 corp: 18/247b lim: 35 exec/s: 0 rss: 69Mb L: 18/24 MS: 1 ChangeBinInt- 00:07:32.911 #40 NEW cov: 11896 ft: 14314 corp: 19/256b lim: 35 exec/s: 40 rss: 69Mb L: 9/24 MS: 1 EraseBytes- 00:07:32.911 [2024-11-27 06:17:02.404352] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.911 [2024-11-27 06:17:02.404380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.911 [2024-11-27 06:17:02.404438] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000006d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.911 [2024-11-27 06:17:02.404453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.911 [2024-11-27 06:17:02.404514] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.911 [2024-11-27 06:17:02.404528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.911 [2024-11-27 06:17:02.404585] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:32.911 [2024-11-27 06:17:02.404603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.911 #41 NEW cov: 11896 ft: 14616 corp: 20/288b lim: 35 exec/s: 41 rss: 70Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:33.171 #42 NEW cov: 11896 ft: 14634 corp: 21/297b lim: 35 exec/s: 42 rss: 70Mb L: 9/32 MS: 1 ChangeBit- 00:07:33.171 [2024-11-27 06:17:02.484385] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.171 [2024-11-27 06:17:02.484412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.171 [2024-11-27 06:17:02.484470] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.171 [2024-11-27 06:17:02.484486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.171 [2024-11-27 06:17:02.484543] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.171 [2024-11-27 06:17:02.484558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.171 #43 NEW cov: 11896 ft: 14757 corp: 22/318b lim: 35 exec/s: 43 rss: 70Mb L: 21/32 MS: 1 PersAutoDict- DE: "\001\035"- 00:07:33.171 #44 NEW cov: 11896 ft: 14772 corp: 23/326b lim: 35 exec/s: 44 rss: 70Mb L: 8/32 MS: 1 ShuffleBytes- 00:07:33.171 #45 NEW cov: 11896 ft: 14783 corp: 24/337b lim: 35 exec/s: 45 rss: 70Mb L: 11/32 MS: 1 CrossOver- 00:07:33.171 #46 NEW cov: 11896 ft: 14792 corp: 25/344b lim: 35 exec/s: 46 rss: 70Mb L: 7/32 MS: 1 CopyPart- 00:07:33.171 [2024-11-27 06:17:02.624508] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.171 [2024-11-27 06:17:02.624533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.171 #47 NEW cov: 11896 ft: 14809 corp: 26/351b lim: 35 exec/s: 47 rss: 70Mb L: 7/32 MS: 1 ShuffleBytes- 00:07:33.171 #48 NEW cov: 11896 ft: 14827 corp: 27/360b lim: 35 exec/s: 48 rss: 70Mb L: 9/32 MS: 1 EraseBytes- 00:07:33.171 [2024-11-27 06:17:02.704974] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.171 [2024-11-27 06:17:02.704999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.431 #49 NEW cov: 11896 ft: 14841 corp: 28/378b lim: 35 exec/s: 49 rss: 70Mb L: 18/32 MS: 1 ChangeBit- 00:07:33.431 [2024-11-27 06:17:02.745097] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.431 [2024-11-27 06:17:02.745122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.431 #50 NEW cov: 11896 ft: 14853 corp: 29/396b lim: 35 exec/s: 50 rss: 70Mb L: 18/32 MS: 1 CrossOver- 00:07:33.431 #51 NEW cov: 11896 ft: 14875 corp: 30/405b lim: 35 exec/s: 51 rss: 70Mb L: 9/32 MS: 1 ChangeByte- 
00:07:33.431 #53 NEW cov: 11896 ft: 14914 corp: 31/413b lim: 35 exec/s: 53 rss: 70Mb L: 8/32 MS: 2 EraseBytes-CMP- DE: "\000\004"- 00:07:33.431 #54 NEW cov: 11896 ft: 15026 corp: 32/424b lim: 35 exec/s: 54 rss: 70Mb L: 11/32 MS: 1 InsertByte- 00:07:33.431 #55 NEW cov: 11896 ft: 15046 corp: 33/434b lim: 35 exec/s: 55 rss: 70Mb L: 10/32 MS: 1 InsertByte- 00:07:33.431 [2024-11-27 06:17:02.925618] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.431 [2024-11-27 06:17:02.925645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.431 #56 NEW cov: 11896 ft: 15074 corp: 34/452b lim: 35 exec/s: 56 rss: 70Mb L: 18/32 MS: 1 ChangeBinInt- 00:07:33.431 [2024-11-27 06:17:02.965816] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.431 [2024-11-27 06:17:02.965842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.690 #57 NEW cov: 11896 ft: 15079 corp: 35/470b lim: 35 exec/s: 57 rss: 70Mb L: 18/32 MS: 1 PersAutoDict- DE: "\000\004"- 00:07:33.690 #58 NEW cov: 11896 ft: 15167 corp: 36/480b lim: 35 exec/s: 58 rss: 70Mb L: 10/32 MS: 1 CrossOver- 00:07:33.690 #59 NEW cov: 11896 ft: 15178 corp: 37/488b lim: 35 exec/s: 59 rss: 70Mb L: 8/32 MS: 1 ChangeByte- 00:07:33.690 #65 NEW cov: 11896 ft: 15238 corp: 38/500b lim: 35 exec/s: 65 rss: 70Mb L: 12/32 MS: 1 CopyPart- 00:07:33.690 #66 NEW cov: 11896 ft: 15239 corp: 39/510b lim: 35 exec/s: 66 rss: 70Mb L: 10/32 MS: 1 ChangeBinInt- 00:07:33.690 #67 NEW cov: 11896 ft: 15286 corp: 40/531b lim: 35 exec/s: 67 rss: 70Mb L: 21/32 MS: 1 CopyPart- 00:07:33.690 #68 NEW cov: 11896 ft: 15310 corp: 41/540b lim: 35 exec/s: 68 rss: 70Mb L: 9/32 MS: 1 ChangeBit- 00:07:33.950 #69 NEW cov: 11896 ft: 15321 corp: 42/551b lim: 35 exec/s: 69 rss: 70Mb L: 11/32 MS: 1 ChangeBinInt- 00:07:33.950 [2024-11-27 06:17:03.266668] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.950 [2024-11-27 06:17:03.266695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.950 #70 NEW cov: 11896 ft: 15397 corp: 43/568b lim: 35 exec/s: 70 rss: 70Mb L: 17/32 MS: 1 CrossOver- 00:07:33.950 [2024-11-27 06:17:03.316908] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.950 [2024-11-27 06:17:03.316936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.950 [2024-11-27 06:17:03.317008] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.950 [2024-11-27 06:17:03.317023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.950 [2024-11-27 06:17:03.317080] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.950 [2024-11-27 06:17:03.317093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.950 #71 NEW cov: 11896 ft: 15428 corp: 44/589b lim: 35 exec/s: 71 rss: 70Mb L: 21/32 MS: 1 CrossOver- 00:07:33.950 [2024-11-27 06:17:03.357165] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.950 [2024-11-27 06:17:03.357194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.950 [2024-11-27 06:17:03.357253] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000006d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.950 [2024-11-27 06:17:03.357269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.950 [2024-11-27 06:17:03.357316] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.950 [2024-11-27 06:17:03.357330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.950 [2024-11-27 06:17:03.357388] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.950 [2024-11-27 06:17:03.357404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.950 [2024-11-27 06:17:03.397271] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.950 [2024-11-27 06:17:03.397310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.950 [2024-11-27 06:17:03.397384] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000089 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.950 [2024-11-27 06:17:03.397400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.950 [2024-11-27 06:17:03.397457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.950 [2024-11-27 06:17:03.397472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.950 [2024-11-27 06:17:03.397530] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.950 [2024-11-27 06:17:03.397544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.950 #73 NEW cov: 11896 ft: 15436 corp: 45/621b lim: 35 exec/s: 36 rss: 70Mb L: 32/32 MS: 2 ChangeBinInt-ShuffleBytes- 00:07:33.950 #73 DONE cov: 11896 ft: 15436 corp: 45/621b lim: 35 exec/s: 36 rss: 70Mb 00:07:33.950 ###### Recommended dictionary. ###### 00:07:33.950 "*\000" # Uses: 1 00:07:33.950 "\001\035" # Uses: 2 00:07:33.950 "\002\000" # Uses: 0 00:07:33.950 "\001\000\000\000\000\000\000\000" # Uses: 0 00:07:33.950 "\000\004" # Uses: 1 00:07:33.950 ###### End of recommended dictionary. 
######
00:07:33.950 Done 73 runs in 2 second(s)
00:07:34.211 06:17:03 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf
00:07:34.211 06:17:03 -- ../common.sh@72 -- # (( i++ ))
00:07:34.211 06:17:03 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:34.211 06:17:03 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1
00:07:34.211 06:17:03 -- nvmf/run.sh@23 -- # local fuzzer_type=15
00:07:34.211 06:17:03 -- nvmf/run.sh@24 -- # local timen=1
00:07:34.211 06:17:03 -- nvmf/run.sh@25 -- # local core=0x1
00:07:34.211 06:17:03 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15
00:07:34.211 06:17:03 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf
00:07:34.211 06:17:03 -- nvmf/run.sh@29 -- # printf %02d 15
00:07:34.211 06:17:03 -- nvmf/run.sh@29 -- # port=4415
00:07:34.211 06:17:03 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15
00:07:34.211 06:17:03 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415'
00:07:34.211 06:17:03 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:34.211 06:17:03 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock
00:07:34.470 [2024-11-27 06:17:03.569143] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:34.470 [2024-11-27 06:17:03.569232] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid35744 ]
00:07:34.470 EAL: No free 2048 kB hugepages reported on node 1
00:07:34.470 [2024-11-27 06:17:03.749067] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:34.470 [2024-11-27 06:17:03.813381] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:34.470 [2024-11-27 06:17:03.813521] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:34.470 [2024-11-27 06:17:03.871381] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:34.470 [2024-11-27 06:17:03.887743] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 ***
00:07:34.470 INFO: Running with entropic power schedule (0xFF, 100).
00:07:34.470 INFO: Seed: 998439785
00:07:34.470 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:34.470 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:34.470 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15
00:07:34.470 INFO: A corpus is not provided, starting from an empty corpus
00:07:34.470 #2 INITED exec/s: 0 rss: 60Mb
00:07:34.470 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:34.470 This may also happen if the target rejected all inputs we tried so far 00:07:34.470 [2024-11-27 06:17:03.933106] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.470 [2024-11-27 06:17:03.933135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.470 [2024-11-27 06:17:03.933192] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.471 [2024-11-27 06:17:03.933205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.471 [2024-11-27 06:17:03.933259] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.471 [2024-11-27 06:17:03.933273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.730 NEW_FUNC[1/670]: 0x44fe48 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:07:34.730 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:34.730 #3 NEW cov: 11560 ft: 11561 corp: 2/23b lim: 35 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:07:34.730 [2024-11-27 06:17:04.233940] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.730 [2024-11-27 06:17:04.233972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.730 [2024-11-27 06:17:04.234032] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.730 [2024-11-27 06:17:04.234046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.730 [2024-11-27 06:17:04.234106] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.730 [2024-11-27 06:17:04.234120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.730 #9 NEW cov: 11673 ft: 12175 corp: 3/45b lim: 35 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 ChangeBinInt- 00:07:34.990 [2024-11-27 06:17:04.284177] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.284204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.284264] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.284278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.284343] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 
cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.284356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.284415] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.284429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.990 #10 NEW cov: 11679 ft: 12801 corp: 4/75b lim: 35 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 CopyPart- 00:07:34.990 [2024-11-27 06:17:04.324187] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.324215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.324292] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.324307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.324366] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.324380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.324441] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.324454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.990 #11 NEW cov: 11764 ft: 13058 corp: 5/105b lim: 35 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 CrossOver- 00:07:34.990 [2024-11-27 06:17:04.364323] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.364347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.364413] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.364427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.364485] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.364499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.364559] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.364572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 
dnr:0 00:07:34.990 #12 NEW cov: 11764 ft: 13139 corp: 6/135b lim: 35 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 ChangeBit- 00:07:34.990 [2024-11-27 06:17:04.404448] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.404475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.404522] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.404535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.404604] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.404617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.404676] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.404689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.990 #18 NEW cov: 11764 ft: 13319 corp: 7/165b lim: 35 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 ChangeByte- 00:07:34.990 [2024-11-27 06:17:04.444443] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.444469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.444530] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.444544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.444605] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.444619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.990 #19 NEW cov: 11764 ft: 13431 corp: 8/190b lim: 35 exec/s: 0 rss: 68Mb L: 25/30 MS: 1 CopyPart- 00:07:34.990 [2024-11-27 06:17:04.484706] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.484732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.484794] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.484808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.484868] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.484883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.484941] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.484955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.990 #20 NEW cov: 11764 ft: 13503 corp: 9/223b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:07:34.990 [2024-11-27 06:17:04.524684] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.524710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.990 [2024-11-27 06:17:04.524769] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.990 [2024-11-27 06:17:04.524784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.250 [2024-11-27 06:17:04.524841] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.250 [2024-11-27 06:17:04.524858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.250 #21 NEW cov: 11764 ft: 13543 corp: 10/246b lim: 35 exec/s: 0 rss: 68Mb L: 23/33 MS: 1 InsertByte- 00:07:35.250 [2024-11-27 06:17:04.564793] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.250 [2024-11-27 06:17:04.564819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.250 [2024-11-27 06:17:04.564878] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.250 [2024-11-27 06:17:04.564892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.250 [2024-11-27 06:17:04.564951] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.250 [2024-11-27 06:17:04.564965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.250 #22 NEW cov: 11764 ft: 13638 corp: 11/268b lim: 35 exec/s: 0 rss: 68Mb L: 22/33 MS: 1 ShuffleBytes- 00:07:35.250 [2024-11-27 06:17:04.605073] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.250 [2024-11-27 06:17:04.605100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.250 [2024-11-27 06:17:04.605161] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:07:35.250 [2024-11-27 06:17:04.605175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.250 [2024-11-27 06:17:04.605235] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.250 [2024-11-27 06:17:04.605249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.250 [2024-11-27 06:17:04.605306] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.250 [2024-11-27 06:17:04.605319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.250 #23 NEW cov: 11764 ft: 13654 corp: 12/298b lim: 35 exec/s: 0 rss: 68Mb L: 30/33 MS: 1 ShuffleBytes- 00:07:35.250 [2024-11-27 06:17:04.645356] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.250 [2024-11-27 06:17:04.645381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.250 [2024-11-27 06:17:04.645442] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.250 [2024-11-27 06:17:04.645455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.250 [2024-11-27 06:17:04.645513] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.250 [2024-11-27 06:17:04.645526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.250 [2024-11-27 06:17:04.645585] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.250 [2024-11-27 06:17:04.645602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.251 [2024-11-27 06:17:04.645678] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007eb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.251 [2024-11-27 06:17:04.645695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.251 #24 NEW cov: 11764 ft: 13721 corp: 13/333b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:35.251 [2024-11-27 06:17:04.685305] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.251 [2024-11-27 06:17:04.685331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.251 [2024-11-27 06:17:04.685392] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.251 [2024-11-27 06:17:04.685406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.251 [2024-11-27 
06:17:04.685464] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.251 [2024-11-27 06:17:04.685477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.251 [2024-11-27 06:17:04.685536] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.251 [2024-11-27 06:17:04.685550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.251 #25 NEW cov: 11764 ft: 13744 corp: 14/363b lim: 35 exec/s: 0 rss: 68Mb L: 30/35 MS: 1 ChangeByte- 00:07:35.251 [2024-11-27 06:17:04.725422] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.251 [2024-11-27 06:17:04.725447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.251 [2024-11-27 06:17:04.725524] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.251 [2024-11-27 06:17:04.725537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.251 [2024-11-27 06:17:04.725596] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.251 [2024-11-27 06:17:04.725616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.251 [2024-11-27 06:17:04.725673] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.251 [2024-11-27 06:17:04.725687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.251 #26 NEW cov: 11764 ft: 13754 corp: 15/393b lim: 35 exec/s: 0 rss: 69Mb L: 30/35 MS: 1 ShuffleBytes- 00:07:35.251 [2024-11-27 06:17:04.765497] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.251 [2024-11-27 06:17:04.765524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.251 [2024-11-27 06:17:04.765586] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.251 [2024-11-27 06:17:04.765604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.251 [2024-11-27 06:17:04.765664] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.251 [2024-11-27 06:17:04.765694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.251 [2024-11-27 06:17:04.765753] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.251 [2024-11-27 06:17:04.765769] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.510 #27 NEW cov: 11764 ft: 13808 corp: 16/423b lim: 35 exec/s: 0 rss: 69Mb L: 30/35 MS: 1 ChangeByte- 00:07:35.510 [2024-11-27 06:17:04.805552] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.510 [2024-11-27 06:17:04.805578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.510 [2024-11-27 06:17:04.805639] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.510 [2024-11-27 06:17:04.805653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.510 [2024-11-27 06:17:04.805712] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.510 [2024-11-27 06:17:04.805726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.510 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:35.510 #28 NEW cov: 11787 ft: 13865 corp: 17/445b lim: 35 exec/s: 0 rss: 69Mb L: 22/35 MS: 1 CopyPart- 00:07:35.510 [2024-11-27 06:17:04.845674] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.510 [2024-11-27 06:17:04.845700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.510 [2024-11-27 06:17:04.845776] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.510 [2024-11-27 06:17:04.845801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.510 [2024-11-27 06:17:04.845857] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000001c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:04.845870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.511 #29 NEW cov: 11787 ft: 13878 corp: 18/470b lim: 35 exec/s: 0 rss: 69Mb L: 25/35 MS: 1 ChangeBinInt- 00:07:35.511 [2024-11-27 06:17:04.885796] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:04.885821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.511 [2024-11-27 06:17:04.885895] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:04.885910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.511 [2024-11-27 06:17:04.885970] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 
[2024-11-27 06:17:04.885983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.511 #30 NEW cov: 11787 ft: 13976 corp: 19/494b lim: 35 exec/s: 0 rss: 69Mb L: 24/35 MS: 1 CrossOver- 00:07:35.511 [2024-11-27 06:17:04.925892] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:04.925918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.511 [2024-11-27 06:17:04.925995] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:04.926012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.511 [2024-11-27 06:17:04.926073] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:04.926087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.511 #31 NEW cov: 11787 ft: 14002 corp: 20/516b lim: 35 exec/s: 31 rss: 69Mb L: 22/35 MS: 1 ChangeBit- 00:07:35.511 [2024-11-27 06:17:04.966014] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:04.966039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.511 [2024-11-27 06:17:04.966097] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:04.966110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.511 [2024-11-27 06:17:04.966169] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:04.966183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.511 #32 NEW cov: 11787 ft: 14024 corp: 21/538b lim: 35 exec/s: 32 rss: 69Mb L: 22/35 MS: 1 CopyPart- 00:07:35.511 [2024-11-27 06:17:04.996044] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:04.996070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.511 [2024-11-27 06:17:04.996145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:04.996159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.511 [2024-11-27 06:17:04.996205] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:04.996218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.511 #33 NEW cov: 11787 ft: 14090 corp: 22/560b lim: 35 exec/s: 33 rss: 69Mb L: 22/35 MS: 1 EraseBytes- 00:07:35.511 [2024-11-27 06:17:05.036192] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:05.036217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.511 [2024-11-27 06:17:05.036292] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000001b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:05.036306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.511 [2024-11-27 06:17:05.036360] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.511 [2024-11-27 06:17:05.036374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.770 #34 NEW cov: 11787 ft: 14140 corp: 23/582b lim: 35 exec/s: 34 rss: 69Mb L: 22/35 MS: 1 ChangeBinInt- 00:07:35.770 [2024-11-27 06:17:05.076307] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.076332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.076395] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.076408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.076468] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.076481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.771 #35 NEW cov: 11787 ft: 14149 corp: 24/604b lim: 35 exec/s: 35 rss: 69Mb L: 22/35 MS: 1 ChangeByte- 00:07:35.771 [2024-11-27 06:17:05.106668] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.106693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.106780] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.106794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.106852] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.106865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.106924] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.106937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.106994] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.107008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.771 #36 NEW cov: 11787 ft: 14162 corp: 25/639b lim: 35 exec/s: 36 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:07:35.771 [2024-11-27 06:17:05.146405] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.146430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.146503] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.146517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.771 #39 NEW cov: 11787 ft: 14414 corp: 26/659b lim: 35 exec/s: 39 rss: 69Mb L: 20/35 MS: 3 InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:35.771 [2024-11-27 06:17:05.186843] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.186868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.186944] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.186959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.187018] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.187035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.187091] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.187105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.771 #40 NEW cov: 11787 ft: 14435 corp: 27/689b lim: 35 exec/s: 40 rss: 69Mb L: 30/35 MS: 1 CopyPart- 00:07:35.771 [2024-11-27 06:17:05.226903] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.226929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.226989] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET 
FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.227003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.227061] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.227074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.227133] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.227146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.771 #41 NEW cov: 11787 ft: 14451 corp: 28/719b lim: 35 exec/s: 41 rss: 69Mb L: 30/35 MS: 1 ChangeBinInt- 00:07:35.771 [2024-11-27 06:17:05.266908] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.266932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.266994] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.267008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.771 [2024-11-27 06:17:05.267066] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.771 [2024-11-27 06:17:05.267079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.771 #42 NEW cov: 11787 ft: 14465 corp: 29/742b lim: 35 exec/s: 42 rss: 69Mb L: 23/35 MS: 1 ChangeByte- 00:07:36.030 [2024-11-27 06:17:05.306907] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.306932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.306993] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.307007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.031 #43 NEW cov: 11787 ft: 14480 corp: 30/757b lim: 35 exec/s: 43 rss: 70Mb L: 15/35 MS: 1 EraseBytes- 00:07:36.031 [2024-11-27 06:17:05.347106] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.347131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.347197] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 
[2024-11-27 06:17:05.347210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.347272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.347285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.031 #44 NEW cov: 11787 ft: 14483 corp: 31/780b lim: 35 exec/s: 44 rss: 70Mb L: 23/35 MS: 1 ShuffleBytes- 00:07:36.031 [2024-11-27 06:17:05.387218] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.387243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.387321] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.387335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.387396] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.387409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.031 #45 NEW cov: 11787 ft: 14490 corp: 32/803b lim: 35 exec/s: 45 rss: 70Mb L: 23/35 MS: 1 EraseBytes- 00:07:36.031 [2024-11-27 06:17:05.427499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.427525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.427605] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.427620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.427681] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.427694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.427764] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.427778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.031 #46 NEW cov: 11787 ft: 14552 corp: 33/833b lim: 35 exec/s: 46 rss: 70Mb L: 30/35 MS: 1 CrossOver- 00:07:36.031 [2024-11-27 06:17:05.467409] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.467435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.467514] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.467528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.467588] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.467607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.031 #47 NEW cov: 11787 ft: 14563 corp: 34/855b lim: 35 exec/s: 47 rss: 70Mb L: 22/35 MS: 1 ShuffleBytes- 00:07:36.031 [2024-11-27 06:17:05.507658] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.507683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.507769] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.507783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.507841] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.507854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.507914] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.507927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.031 #48 NEW cov: 11787 ft: 14579 corp: 35/885b lim: 35 exec/s: 48 rss: 70Mb L: 30/35 MS: 1 CopyPart- 00:07:36.031 [2024-11-27 06:17:05.547695] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.547720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.547779] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.547793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.031 [2024-11-27 06:17:05.547853] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.031 [2024-11-27 06:17:05.547866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.291 #49 NEW cov: 11787 ft: 14589 corp: 36/906b lim: 35 exec/s: 49 rss: 70Mb L: 21/35 MS: 1 EraseBytes- 00:07:36.291 [2024-11-27 
06:17:05.587921] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.587948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.588023] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.588037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.588099] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.588113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.588175] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.588188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.291 #50 NEW cov: 11787 ft: 14611 corp: 37/936b lim: 35 exec/s: 50 rss: 70Mb L: 30/35 MS: 1 CrossOver- 00:07:36.291 [2024-11-27 06:17:05.628186] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.628214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.628292] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.628306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.628367] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.628380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.628440] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.628453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.628512] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.628526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.291 #51 NEW cov: 11787 ft: 14626 corp: 38/971b lim: 35 exec/s: 51 rss: 70Mb L: 35/35 MS: 1 CrossOver- 00:07:36.291 [2024-11-27 06:17:05.668271] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.668297] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.668358] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.668372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.668431] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.668445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.668506] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.668519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.668578] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.668592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.291 #52 NEW cov: 11787 ft: 14637 corp: 39/1006b lim: 35 exec/s: 52 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:07:36.291 [2024-11-27 06:17:05.708370] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.708395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.708457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.708471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.708532] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.708549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.708615] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.708629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.708690] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.708703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.291 #53 NEW cov: 11787 ft: 14679 corp: 40/1041b lim: 35 exec/s: 53 rss: 70Mb L: 35/35 MS: 1 ChangeBit- 00:07:36.291 [2024-11-27 06:17:05.748386] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.748411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.748475] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.748488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.748549] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.748562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.748624] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.748637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.291 #54 NEW cov: 11787 ft: 14685 corp: 41/1074b lim: 35 exec/s: 54 rss: 70Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:07:36.291 [2024-11-27 06:17:05.788384] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.788409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.788486] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.788500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.291 [2024-11-27 06:17:05.788562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.291 [2024-11-27 06:17:05.788575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.291 #55 NEW cov: 11787 ft: 14703 corp: 42/1098b lim: 35 exec/s: 55 rss: 70Mb L: 24/35 MS: 1 ChangeBinInt- 00:07:36.551 [2024-11-27 06:17:05.828804] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.551 [2024-11-27 06:17:05.828829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.551 [2024-11-27 06:17:05.828893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.551 [2024-11-27 06:17:05.828907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.551 [2024-11-27 06:17:05.828970] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.551 [2024-11-27 06:17:05.828986] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.551 [2024-11-27 06:17:05.829048] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.551 [2024-11-27 06:17:05.829061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.551 [2024-11-27 06:17:05.829121] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.551 [2024-11-27 06:17:05.829135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.551 #56 NEW cov: 11787 ft: 14743 corp: 43/1133b lim: 35 exec/s: 56 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:07:36.552 [2024-11-27 06:17:05.868500] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.552 [2024-11-27 06:17:05.868525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.552 [2024-11-27 06:17:05.868587] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.552 [2024-11-27 06:17:05.868606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.552 #57 NEW cov: 11787 ft: 14769 corp: 44/1153b lim: 35 exec/s: 57 rss: 70Mb L: 20/35 MS: 1 ChangeBit- 00:07:36.552 [2024-11-27 06:17:05.908737] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.552 [2024-11-27 06:17:05.908762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.552 [2024-11-27 06:17:05.908818] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.552 [2024-11-27 06:17:05.908831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.552 [2024-11-27 06:17:05.908893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.552 [2024-11-27 06:17:05.908905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.552 #58 NEW cov: 11787 ft: 14785 corp: 45/1175b lim: 35 exec/s: 29 rss: 70Mb L: 22/35 MS: 1 CopyPart- 00:07:36.552 #58 DONE cov: 11787 ft: 14785 corp: 45/1175b lim: 35 exec/s: 29 rss: 70Mb 00:07:36.552 Done 58 runs in 2 second(s) 00:07:36.552 06:17:06 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:07:36.552 06:17:06 -- ../common.sh@72 -- # (( i++ )) 00:07:36.552 06:17:06 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:36.552 06:17:06 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:07:36.552 06:17:06 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:07:36.552 06:17:06 -- nvmf/run.sh@24 -- # local timen=1 00:07:36.552 06:17:06 -- nvmf/run.sh@25 -- # local core=0x1 00:07:36.552 06:17:06 -- nvmf/run.sh@26 -- # local 
corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:36.552 06:17:06 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:07:36.552 06:17:06 -- nvmf/run.sh@29 -- # printf %02d 16 00:07:36.552 06:17:06 -- nvmf/run.sh@29 -- # port=4416 00:07:36.552 06:17:06 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:36.552 06:17:06 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:07:36.552 06:17:06 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:36.552 06:17:06 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:07:36.812 [2024-11-27 06:17:06.088734] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:36.812 [2024-11-27 06:17:06.088828] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid36055 ] 00:07:36.812 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.812 [2024-11-27 06:17:06.269796] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.812 [2024-11-27 06:17:06.334193] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:36.812 [2024-11-27 06:17:06.334336] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.071 [2024-11-27 06:17:06.392413] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:37.071 [2024-11-27 06:17:06.408791] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:07:37.071 INFO: Running with entropic power schedule (0xFF, 100). 00:07:37.071 INFO: Seed: 3519469663 00:07:37.071 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:37.071 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:37.071 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:37.071 INFO: A corpus is not provided, starting from an empty corpus 00:07:37.071 #2 INITED exec/s: 0 rss: 60Mb 00:07:37.071 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:37.071 This may also happen if the target rejected all inputs we tried so far 00:07:37.071 [2024-11-27 06:17:06.454208] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.071 [2024-11-27 06:17:06.454239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.071 [2024-11-27 06:17:06.454278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.071 [2024-11-27 06:17:06.454294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.071 [2024-11-27 06:17:06.454350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.071 [2024-11-27 06:17:06.454365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.071 [2024-11-27 06:17:06.454422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.071 [2024-11-27 06:17:06.454437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:37.330 NEW_FUNC[1/671]: 0x451308 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:07:37.330 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:37.330 #4 NEW cov: 11663 ft: 11664 corp: 2/87b lim: 105 exec/s: 0 rss: 68Mb L: 86/86 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:37.330 [2024-11-27 06:17:06.754655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.330 [2024-11-27 06:17:06.754689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.330 [2024-11-27 06:17:06.754743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.330 [2024-11-27 06:17:06.754761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.330 #13 NEW cov: 11776 ft: 12810 corp: 3/149b lim: 105 exec/s: 0 rss: 68Mb L: 62/86 MS: 4 ChangeBit-InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:37.330 [2024-11-27 06:17:06.794696] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.330 [2024-11-27 06:17:06.794725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.330 [2024-11-27 06:17:06.794782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.330 [2024-11-27 06:17:06.794797] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.330 #14 NEW cov: 11782 ft: 13030 corp: 4/211b lim: 105 exec/s: 0 rss: 68Mb L: 62/86 MS: 1 ChangeByte- 00:07:37.330 [2024-11-27 06:17:06.834841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.330 [2024-11-27 06:17:06.834869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.330 [2024-11-27 06:17:06.834924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.330 [2024-11-27 06:17:06.834939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.330 #20 NEW cov: 11867 ft: 13280 corp: 5/255b lim: 105 exec/s: 0 rss: 68Mb L: 44/86 MS: 1 EraseBytes- 00:07:37.589 [2024-11-27 06:17:06.875173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:06.875200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.589 [2024-11-27 06:17:06.875241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:06.875257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.589 [2024-11-27 06:17:06.875312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:06.875328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.589 [2024-11-27 06:17:06.875383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:06.875398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:37.589 #21 NEW cov: 11867 ft: 13400 corp: 6/341b lim: 105 exec/s: 0 rss: 68Mb L: 86/86 MS: 1 ChangeByte- 00:07:37.589 [2024-11-27 06:17:06.915317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:06.915345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.589 [2024-11-27 06:17:06.915394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:06.915409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.589 #22 NEW cov: 11867 ft: 13581 corp: 7/401b lim: 105 exec/s: 0 rss: 68Mb L: 
60/86 MS: 1 EraseBytes- 00:07:37.589 [2024-11-27 06:17:06.955177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18158513694051401727 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:06.955204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.589 [2024-11-27 06:17:06.955244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:06.955259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.589 #23 NEW cov: 11867 ft: 13703 corp: 8/463b lim: 105 exec/s: 0 rss: 68Mb L: 62/86 MS: 1 ChangeBit- 00:07:37.589 [2024-11-27 06:17:06.995251] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:06.995278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.589 [2024-11-27 06:17:06.995332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744070169559039 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:06.995348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.589 #24 NEW cov: 11867 ft: 13815 corp: 9/525b lim: 105 exec/s: 0 rss: 68Mb L: 62/86 MS: 1 ChangeByte- 00:07:37.589 [2024-11-27 06:17:07.035632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:07.035660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.589 [2024-11-27 06:17:07.035713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446743695752429567 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:07.035728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.589 [2024-11-27 06:17:07.035783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:07.035814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.589 [2024-11-27 06:17:07.035868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:07.035883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:37.589 #25 NEW cov: 11867 ft: 13881 corp: 10/618b lim: 105 exec/s: 0 rss: 68Mb L: 93/93 MS: 1 InsertRepeatedBytes- 00:07:37.589 [2024-11-27 06:17:07.075511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:07.075539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.589 [2024-11-27 06:17:07.075593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:07.075616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.589 #26 NEW cov: 11867 ft: 13922 corp: 11/667b lim: 105 exec/s: 0 rss: 68Mb L: 49/93 MS: 1 CrossOver- 00:07:37.589 [2024-11-27 06:17:07.115836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:07.115864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.589 [2024-11-27 06:17:07.115912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:07.115928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.589 [2024-11-27 06:17:07.115983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:07.115998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.589 [2024-11-27 06:17:07.116052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.589 [2024-11-27 06:17:07.116068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:37.848 #27 NEW cov: 11867 ft: 13983 corp: 12/758b lim: 105 exec/s: 0 rss: 69Mb L: 91/93 MS: 1 CopyPart- 00:07:37.848 [2024-11-27 06:17:07.155986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.156014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.156063] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.156079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.156134] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.156150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.156205] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 
lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.156220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:37.848 #28 NEW cov: 11867 ft: 14081 corp: 13/849b lim: 105 exec/s: 0 rss: 69Mb L: 91/93 MS: 1 ChangeBinInt- 00:07:37.848 [2024-11-27 06:17:07.196132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.196159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.196209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.196225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.196280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.196295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.196350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18387915803577024511 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.196369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:37.848 #29 NEW cov: 11867 ft: 14105 corp: 14/945b lim: 105 exec/s: 0 rss: 69Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:07:37.848 [2024-11-27 06:17:07.236229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.236256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.236305] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.236321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.236374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.236389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.236446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.236461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:37.848 #30 NEW cov: 11867 ft: 14164 corp: 15/1031b lim: 105 exec/s: 0 rss: 69Mb L: 86/96 MS: 1 ChangeBit- 
00:07:37.848 [2024-11-27 06:17:07.276315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18302063728033398269 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.276342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.276392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18302063728033398269 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.276407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.276462] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18302063728033398269 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.276477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.276531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18302063728033398269 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.276546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:37.848 #32 NEW cov: 11867 ft: 14240 corp: 16/1132b lim: 105 exec/s: 0 rss: 69Mb L: 101/101 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:37.848 [2024-11-27 06:17:07.316295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.316322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.316370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446743695752429567 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.316386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.316443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.316474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.848 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:37.848 #33 NEW cov: 11890 ft: 14560 corp: 17/1204b lim: 105 exec/s: 0 rss: 69Mb L: 72/101 MS: 1 CrossOver- 00:07:37.848 [2024-11-27 06:17:07.366497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743795325206527 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.366524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.366576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446743695752429567 len:42920 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.366592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.848 [2024-11-27 06:17:07.366653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.848 [2024-11-27 06:17:07.366669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.107 #34 NEW cov: 11890 ft: 14589 corp: 18/1276b lim: 105 exec/s: 0 rss: 69Mb L: 72/101 MS: 1 ChangeBit- 00:07:38.107 [2024-11-27 06:17:07.406378] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18158513694051401727 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.107 [2024-11-27 06:17:07.406405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.107 #35 NEW cov: 11890 ft: 15016 corp: 19/1307b lim: 105 exec/s: 0 rss: 69Mb L: 31/101 MS: 1 EraseBytes- 00:07:38.107 [2024-11-27 06:17:07.446740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.107 [2024-11-27 06:17:07.446767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.107 [2024-11-27 06:17:07.446809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.107 [2024-11-27 06:17:07.446825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.107 [2024-11-27 06:17:07.446880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.107 [2024-11-27 06:17:07.446895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.107 #36 NEW cov: 11890 ft: 15033 corp: 20/1372b lim: 105 exec/s: 36 rss: 69Mb L: 65/101 MS: 1 EraseBytes- 00:07:38.107 [2024-11-27 06:17:07.486840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.107 [2024-11-27 06:17:07.486868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.108 [2024-11-27 06:17:07.486922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744070169559039 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.108 [2024-11-27 06:17:07.486938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.108 [2024-11-27 06:17:07.486992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.108 [2024-11-27 06:17:07.487011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.108 
#37 NEW cov: 11890 ft: 15068 corp: 21/1445b lim: 105 exec/s: 37 rss: 69Mb L: 73/101 MS: 1 CopyPart- 00:07:38.108 [2024-11-27 06:17:07.526978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.108 [2024-11-27 06:17:07.527006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.108 [2024-11-27 06:17:07.527044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.108 [2024-11-27 06:17:07.527059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.108 [2024-11-27 06:17:07.527115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.108 [2024-11-27 06:17:07.527129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.108 #38 NEW cov: 11890 ft: 15084 corp: 22/1518b lim: 105 exec/s: 38 rss: 69Mb L: 73/101 MS: 1 InsertRepeatedBytes- 00:07:38.108 [2024-11-27 06:17:07.567053] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.108 [2024-11-27 06:17:07.567082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.108 [2024-11-27 06:17:07.567116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.108 [2024-11-27 06:17:07.567131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.108 [2024-11-27 06:17:07.567184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.108 [2024-11-27 06:17:07.567201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.108 #39 NEW cov: 11890 ft: 15105 corp: 23/1591b lim: 105 exec/s: 39 rss: 69Mb L: 73/101 MS: 1 InsertByte- 00:07:38.108 [2024-11-27 06:17:07.607287] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:184549120 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.108 [2024-11-27 06:17:07.607315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.108 [2024-11-27 06:17:07.607366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.108 [2024-11-27 06:17:07.607382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.108 [2024-11-27 06:17:07.607437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.108 [2024-11-27 06:17:07.607452] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.108 [2024-11-27 06:17:07.607508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.108 [2024-11-27 06:17:07.607523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.108 #40 NEW cov: 11890 ft: 15118 corp: 24/1682b lim: 105 exec/s: 40 rss: 69Mb L: 91/101 MS: 1 EraseBytes- 00:07:38.366 [2024-11-27 06:17:07.647443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.366 [2024-11-27 06:17:07.647471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.366 [2024-11-27 06:17:07.647520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.366 [2024-11-27 06:17:07.647537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.366 [2024-11-27 06:17:07.647592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:63488 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.366 [2024-11-27 06:17:07.647613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.366 [2024-11-27 06:17:07.647666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.366 [2024-11-27 06:17:07.647681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.366 #41 NEW cov: 11890 ft: 15136 corp: 25/1769b lim: 105 exec/s: 41 rss: 69Mb L: 87/101 MS: 1 InsertByte- 00:07:38.366 [2024-11-27 06:17:07.687286] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18158513694051401727 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.366 [2024-11-27 06:17:07.687314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.366 [2024-11-27 06:17:07.687376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.366 [2024-11-27 06:17:07.687392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.366 #42 NEW cov: 11890 ft: 15142 corp: 26/1824b lim: 105 exec/s: 42 rss: 69Mb L: 55/101 MS: 1 EraseBytes- 00:07:38.366 [2024-11-27 06:17:07.727532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.366 [2024-11-27 06:17:07.727559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.366 [2024-11-27 06:17:07.727617] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.366 [2024-11-27 06:17:07.727632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.366 [2024-11-27 06:17:07.727689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12008751269904033446 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.366 [2024-11-27 06:17:07.727705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.366 #43 NEW cov: 11890 ft: 15184 corp: 27/1902b lim: 105 exec/s: 43 rss: 70Mb L: 78/101 MS: 1 InsertRepeatedBytes- 00:07:38.366 [2024-11-27 06:17:07.767672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.366 [2024-11-27 06:17:07.767700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.366 [2024-11-27 06:17:07.767747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.366 [2024-11-27 06:17:07.767763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.366 [2024-11-27 06:17:07.767820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:59304 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.367 [2024-11-27 06:17:07.767835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.367 #44 NEW cov: 11890 ft: 15195 corp: 28/1975b lim: 105 exec/s: 44 rss: 70Mb L: 73/101 MS: 1 ChangeBit- 00:07:38.367 [2024-11-27 06:17:07.807904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.367 [2024-11-27 06:17:07.807931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.367 [2024-11-27 06:17:07.807980] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.367 [2024-11-27 06:17:07.807996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.367 [2024-11-27 06:17:07.808050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:63488 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.367 [2024-11-27 06:17:07.808065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.367 [2024-11-27 06:17:07.808119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.367 [2024-11-27 06:17:07.808134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.367 #45 NEW cov: 
11890 ft: 15219 corp: 29/2062b lim: 105 exec/s: 45 rss: 70Mb L: 87/101 MS: 1 ChangeBit- 00:07:38.367 [2024-11-27 06:17:07.847793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.367 [2024-11-27 06:17:07.847820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.367 [2024-11-27 06:17:07.847888] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.367 [2024-11-27 06:17:07.847904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.367 #46 NEW cov: 11890 ft: 15234 corp: 30/2106b lim: 105 exec/s: 46 rss: 70Mb L: 44/101 MS: 1 ShuffleBytes- 00:07:38.367 [2024-11-27 06:17:07.888149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.367 [2024-11-27 06:17:07.888177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.367 [2024-11-27 06:17:07.888224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.367 [2024-11-27 06:17:07.888240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.367 [2024-11-27 06:17:07.888295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:3283687080239334544 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.367 [2024-11-27 06:17:07.888309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.367 [2024-11-27 06:17:07.888365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18387915803577024511 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.367 [2024-11-27 06:17:07.888384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.625 #47 NEW cov: 11890 ft: 15248 corp: 31/2202b lim: 105 exec/s: 47 rss: 70Mb L: 96/101 MS: 1 CMP- DE: "_s4\204\220-\222\000"- 00:07:38.625 [2024-11-27 06:17:07.928237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:07.928266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:07.928327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:07.928343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:07.928399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:07.928415] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:07.928470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:07.928486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.625 #48 NEW cov: 11890 ft: 15280 corp: 32/2288b lim: 105 exec/s: 48 rss: 70Mb L: 86/101 MS: 1 ChangeBit- 00:07:38.625 [2024-11-27 06:17:07.968355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18302063728033398269 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:07.968383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:07.968447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18302063728033398269 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:07.968463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:07.968518] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6340503117737492224 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:07.968534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:07.968590] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18302063728033398269 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:07.968612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.625 #49 NEW cov: 11890 ft: 15285 corp: 33/2389b lim: 105 exec/s: 49 rss: 70Mb L: 101/101 MS: 1 CMP- DE: "\377\377\000W"- 00:07:38.625 [2024-11-27 06:17:08.008362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743795325206527 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.008390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:08.008443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446743695752429567 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.008459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:08.008515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.008534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.625 #50 NEW cov: 11890 ft: 15300 corp: 34/2461b lim: 105 exec/s: 50 rss: 70Mb L: 72/101 MS: 1 ShuffleBytes- 00:07:38.625 [2024-11-27 06:17:08.048603] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18302063728033398269 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.048631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:08.048687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18302063728033398269 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.048701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:08.048759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6877898801726488061 len:37377 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.048775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:08.048829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18302063728033398269 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.048845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.625 #51 NEW cov: 11890 ft: 15317 corp: 35/2562b lim: 105 exec/s: 51 rss: 70Mb L: 101/101 MS: 1 PersAutoDict- DE: "_s4\204\220-\222\000"- 00:07:38.625 [2024-11-27 06:17:08.088612] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18302063728033398269 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.088640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:08.088690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18302063728033398269 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.088705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:08.088759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18302063728033398269 len:65022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.088776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.625 #52 NEW cov: 11890 ft: 15328 corp: 36/2636b lim: 105 exec/s: 52 rss: 70Mb L: 74/101 MS: 1 EraseBytes- 00:07:38.625 [2024-11-27 06:17:08.128805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.128831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:08.128888] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4294967040 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.128902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 
m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:08.128972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12080808865440989183 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.128987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.625 [2024-11-27 06:17:08.129042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:12080808863942027175 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.625 [2024-11-27 06:17:08.129060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.625 #53 NEW cov: 11890 ft: 15369 corp: 37/2731b lim: 105 exec/s: 53 rss: 70Mb L: 95/101 MS: 1 InsertRepeatedBytes- 00:07:38.884 [2024-11-27 06:17:08.168736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.168763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.884 [2024-11-27 06:17:08.168803] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.168818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.884 #54 NEW cov: 11890 ft: 15380 corp: 38/2775b lim: 105 exec/s: 54 rss: 70Mb L: 44/101 MS: 1 ChangeBit- 00:07:38.884 [2024-11-27 06:17:08.208955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743795325206527 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.208983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.884 [2024-11-27 06:17:08.209038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446743695752429567 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.209054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.884 [2024-11-27 06:17:08.209110] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.209126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.884 #55 NEW cov: 11890 ft: 15386 corp: 39/2847b lim: 105 exec/s: 55 rss: 70Mb L: 72/101 MS: 1 ChangeBit- 00:07:38.884 [2024-11-27 06:17:08.249169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.249196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.884 [2024-11-27 06:17:08.249261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446743695752429567 len:42920 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.249277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.884 [2024-11-27 06:17:08.249331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.249347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.884 [2024-11-27 06:17:08.249401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:12153149036796881064 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.249417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.884 #56 NEW cov: 11890 ft: 15399 corp: 40/2940b lim: 105 exec/s: 56 rss: 70Mb L: 93/101 MS: 1 InsertRepeatedBytes- 00:07:38.884 [2024-11-27 06:17:08.289169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743713720827903 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.289203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.884 [2024-11-27 06:17:08.289242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.289258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.884 [2024-11-27 06:17:08.289312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.289327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.884 #57 NEW cov: 11890 ft: 15402 corp: 41/3014b lim: 105 exec/s: 57 rss: 70Mb L: 74/101 MS: 1 InsertByte- 00:07:38.884 [2024-11-27 06:17:08.329412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.329439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.884 [2024-11-27 06:17:08.329488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:651061555542690057 len:2314 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.329504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.884 [2024-11-27 06:17:08.329557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.329572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.884 [2024-11-27 06:17:08.329647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 
lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.329663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.884 #58 NEW cov: 11890 ft: 15434 corp: 42/3109b lim: 105 exec/s: 58 rss: 70Mb L: 95/101 MS: 1 InsertRepeatedBytes- 00:07:38.884 [2024-11-27 06:17:08.369280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070203113471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.884 [2024-11-27 06:17:08.369308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.885 [2024-11-27 06:17:08.369345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.885 [2024-11-27 06:17:08.369359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.885 #59 NEW cov: 11890 ft: 15446 corp: 43/3153b lim: 105 exec/s: 59 rss: 70Mb L: 44/101 MS: 1 ShuffleBytes- 00:07:38.885 [2024-11-27 06:17:08.409536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743795325206527 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.885 [2024-11-27 06:17:08.409564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.885 [2024-11-27 06:17:08.409605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446743695752429567 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.885 [2024-11-27 06:17:08.409621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.885 [2024-11-27 06:17:08.409674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.885 [2024-11-27 06:17:08.409709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:39.143 #60 NEW cov: 11890 ft: 15460 corp: 44/3225b lim: 105 exec/s: 60 rss: 70Mb L: 72/101 MS: 1 CMP- DE: "\002\000\000\000"- 00:07:39.143 [2024-11-27 06:17:08.449517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.143 [2024-11-27 06:17:08.449545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.143 [2024-11-27 06:17:08.449605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073694347263 len:2048 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.143 [2024-11-27 06:17:08.449620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.143 #61 NEW cov: 11890 ft: 15466 corp: 45/3286b lim: 105 exec/s: 30 rss: 70Mb L: 61/101 MS: 1 InsertByte- 00:07:39.143 #61 DONE cov: 11890 ft: 15466 corp: 45/3286b lim: 105 exec/s: 30 rss: 70Mb 00:07:39.143 ###### Recommended dictionary. 
###### 00:07:39.143 "_s4\204\220-\222\000" # Uses: 1 00:07:39.143 "\377\377\000W" # Uses: 0 00:07:39.143 "\002\000\000\000" # Uses: 0 00:07:39.143 ###### End of recommended dictionary. ###### 00:07:39.143 Done 61 runs in 2 second(s) 00:07:39.143 06:17:08 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:07:39.143 06:17:08 -- ../common.sh@72 -- # (( i++ )) 00:07:39.143 06:17:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:39.143 06:17:08 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:07:39.143 06:17:08 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:07:39.143 06:17:08 -- nvmf/run.sh@24 -- # local timen=1 00:07:39.143 06:17:08 -- nvmf/run.sh@25 -- # local core=0x1 00:07:39.143 06:17:08 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:39.143 06:17:08 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:07:39.143 06:17:08 -- nvmf/run.sh@29 -- # printf %02d 17 00:07:39.143 06:17:08 -- nvmf/run.sh@29 -- # port=4417 00:07:39.143 06:17:08 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:39.143 06:17:08 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:07:39.143 06:17:08 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:39.143 06:17:08 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:07:39.143 [2024-11-27 06:17:08.632085] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:39.143 [2024-11-27 06:17:08.632166] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid36577 ] 00:07:39.143 EAL: No free 2048 kB hugepages reported on node 1 00:07:39.401 [2024-11-27 06:17:08.809903] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.401 [2024-11-27 06:17:08.873688] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:39.401 [2024-11-27 06:17:08.873831] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.401 [2024-11-27 06:17:08.931804] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:39.660 [2024-11-27 06:17:08.948153] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:07:39.660 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:39.660 INFO: Seed: 1763496900 00:07:39.660 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:39.660 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:39.660 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:39.660 INFO: A corpus is not provided, starting from an empty corpus 00:07:39.660 #2 INITED exec/s: 0 rss: 60Mb 00:07:39.660 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:39.660 This may also happen if the target rejected all inputs we tried so far 00:07:39.660 [2024-11-27 06:17:09.003625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.660 [2024-11-27 06:17:09.003657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.660 [2024-11-27 06:17:09.003709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.660 [2024-11-27 06:17:09.003726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.660 [2024-11-27 06:17:09.003778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.660 [2024-11-27 06:17:09.003794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:39.660 [2024-11-27 06:17:09.003848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.660 [2024-11-27 06:17:09.003864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:39.917 NEW_FUNC[1/669]: 0x4545f8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:07:39.917 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:39.917 #11 NEW cov: 11660 ft: 11635 corp: 2/113b lim: 120 exec/s: 0 rss: 68Mb L: 112/112 MS: 4 ChangeBit-InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:39.917 [2024-11-27 06:17:09.324075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.917 [2024-11-27 06:17:09.324129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.917 NEW_FUNC[1/3]: 0x151e658 in nvme_ctrlr_process_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3790 00:07:39.917 NEW_FUNC[2/3]: 0x16ee128 in spdk_nvme_probe_poll_async /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme.c:1507 00:07:39.917 #19 NEW cov: 11797 ft: 13184 corp: 3/159b lim: 120 exec/s: 0 rss: 68Mb L: 46/112 MS: 3 CrossOver-CrossOver-InsertRepeatedBytes- 00:07:39.917 [2024-11-27 06:17:09.374178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:39.917 [2024-11-27 06:17:09.374208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.917 [2024-11-27 06:17:09.374277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.917 [2024-11-27 06:17:09.374292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.917 #20 NEW cov: 11803 ft: 13782 corp: 4/217b lim: 120 exec/s: 0 rss: 68Mb L: 58/112 MS: 1 CrossOver- 00:07:39.917 [2024-11-27 06:17:09.414164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.917 [2024-11-27 06:17:09.414194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.917 #24 NEW cov: 11888 ft: 14057 corp: 5/264b lim: 120 exec/s: 0 rss: 68Mb L: 47/112 MS: 4 CrossOver-EraseBytes-InsertByte-CrossOver- 00:07:40.175 [2024-11-27 06:17:09.454268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.175 [2024-11-27 06:17:09.454298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.175 #25 NEW cov: 11888 ft: 14336 corp: 6/306b lim: 120 exec/s: 0 rss: 68Mb L: 42/112 MS: 1 EraseBytes- 00:07:40.175 [2024-11-27 06:17:09.494531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.175 [2024-11-27 06:17:09.494560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.175 [2024-11-27 06:17:09.494616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.175 [2024-11-27 06:17:09.494632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.175 #26 NEW cov: 11888 ft: 14417 corp: 7/367b lim: 120 exec/s: 0 rss: 68Mb L: 61/112 MS: 1 CrossOver- 00:07:40.175 [2024-11-27 06:17:09.534505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.175 [2024-11-27 06:17:09.534534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.175 #27 NEW cov: 11888 ft: 14545 corp: 8/411b lim: 120 exec/s: 0 rss: 68Mb L: 44/112 MS: 1 EraseBytes- 00:07:40.175 [2024-11-27 06:17:09.575060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.175 [2024-11-27 06:17:09.575090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.175 [2024-11-27 06:17:09.575127] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:40.175 [2024-11-27 06:17:09.575143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.175 [2024-11-27 06:17:09.575192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.175 [2024-11-27 06:17:09.575208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.175 [2024-11-27 06:17:09.575260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.175 [2024-11-27 06:17:09.575276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.175 #30 NEW cov: 11888 ft: 14574 corp: 9/510b lim: 120 exec/s: 0 rss: 68Mb L: 99/112 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:07:40.175 [2024-11-27 06:17:09.615178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.175 [2024-11-27 06:17:09.615207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.175 [2024-11-27 06:17:09.615262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744070219890687 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.175 [2024-11-27 06:17:09.615277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.175 [2024-11-27 06:17:09.615329] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.175 [2024-11-27 06:17:09.615347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.175 [2024-11-27 06:17:09.615400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.175 [2024-11-27 06:17:09.615415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.175 #31 NEW cov: 11888 ft: 14699 corp: 10/610b lim: 120 exec/s: 0 rss: 68Mb L: 100/112 MS: 1 InsertByte- 00:07:40.176 [2024-11-27 06:17:09.654869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.176 [2024-11-27 06:17:09.654897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.176 #32 NEW cov: 11888 ft: 14809 corp: 11/654b lim: 120 exec/s: 0 rss: 69Mb L: 44/112 MS: 1 CopyPart- 00:07:40.176 [2024-11-27 06:17:09.695125] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.176 [2024-11-27 06:17:09.695153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.176 [2024-11-27 
06:17:09.695204] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.176 [2024-11-27 06:17:09.695219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.436 #33 NEW cov: 11888 ft: 14841 corp: 12/702b lim: 120 exec/s: 0 rss: 69Mb L: 48/112 MS: 1 InsertByte- 00:07:40.436 [2024-11-27 06:17:09.735375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.735402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.436 [2024-11-27 06:17:09.735439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.735453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.436 [2024-11-27 06:17:09.735506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.735520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.436 #34 NEW cov: 11888 ft: 15148 corp: 13/784b lim: 120 exec/s: 0 rss: 69Mb L: 82/112 MS: 1 CopyPart- 00:07:40.436 [2024-11-27 06:17:09.775356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.775384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.436 [2024-11-27 06:17:09.775422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:726216478147873290 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.775437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.436 #35 NEW cov: 11888 ft: 15167 corp: 14/832b lim: 120 exec/s: 0 rss: 69Mb L: 48/112 MS: 1 ChangeBinInt- 00:07:40.436 [2024-11-27 06:17:09.815773] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.815801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.436 [2024-11-27 06:17:09.815840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744070219890687 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.815855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.436 [2024-11-27 06:17:09.815907] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.815921] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.436 [2024-11-27 06:17:09.815972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.815986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.436 #36 NEW cov: 11888 ft: 15186 corp: 15/951b lim: 120 exec/s: 0 rss: 69Mb L: 119/119 MS: 1 InsertRepeatedBytes- 00:07:40.436 [2024-11-27 06:17:09.855606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.855633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.436 [2024-11-27 06:17:09.855694] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.855710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.436 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:40.436 #37 NEW cov: 11911 ft: 15213 corp: 16/1012b lim: 120 exec/s: 0 rss: 69Mb L: 61/119 MS: 1 ShuffleBytes- 00:07:40.436 [2024-11-27 06:17:09.905877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2607 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.905904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.436 [2024-11-27 06:17:09.905941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1443977668760046090 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.905956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.436 [2024-11-27 06:17:09.905996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:723401728380766742 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.906012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.436 #38 NEW cov: 11911 ft: 15228 corp: 17/1095b lim: 120 exec/s: 0 rss: 69Mb L: 83/119 MS: 1 CopyPart- 00:07:40.436 [2024-11-27 06:17:09.946006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.946034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.436 [2024-11-27 06:17:09.946071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.946085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.436 
[2024-11-27 06:17:09.946139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.436 [2024-11-27 06:17:09.946158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.696 #39 NEW cov: 11911 ft: 15241 corp: 18/1171b lim: 120 exec/s: 0 rss: 69Mb L: 76/119 MS: 1 InsertRepeatedBytes- 00:07:40.696 [2024-11-27 06:17:09.986272] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.696 [2024-11-27 06:17:09.986300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.696 [2024-11-27 06:17:09.986332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744070219890687 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.696 [2024-11-27 06:17:09.986346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.696 [2024-11-27 06:17:09.986397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.696 [2024-11-27 06:17:09.986412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.696 [2024-11-27 06:17:09.986463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.696 [2024-11-27 06:17:09.986478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.696 #40 NEW cov: 11911 ft: 15258 corp: 19/1272b lim: 120 exec/s: 40 rss: 69Mb L: 101/119 MS: 1 InsertByte- 00:07:40.696 [2024-11-27 06:17:10.026135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728384895498 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.696 [2024-11-27 06:17:10.026165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.696 [2024-11-27 06:17:10.026217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723412723497044490 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.696 [2024-11-27 06:17:10.026233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.696 #41 NEW cov: 11911 ft: 15286 corp: 20/1321b lim: 120 exec/s: 41 rss: 69Mb L: 49/119 MS: 1 InsertByte- 00:07:40.696 [2024-11-27 06:17:10.066237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.696 [2024-11-27 06:17:10.066265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.696 [2024-11-27 06:17:10.066302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.696 [2024-11-27 06:17:10.066317] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.696 #42 NEW cov: 11911 ft: 15339 corp: 21/1379b lim: 120 exec/s: 42 rss: 69Mb L: 58/119 MS: 1 CopyPart- 00:07:40.696 [2024-11-27 06:17:10.106223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.696 [2024-11-27 06:17:10.106253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.696 #43 NEW cov: 11911 ft: 15370 corp: 22/1426b lim: 120 exec/s: 43 rss: 69Mb L: 47/119 MS: 1 InsertByte- 00:07:40.696 [2024-11-27 06:17:10.156510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.696 [2024-11-27 06:17:10.156539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.696 [2024-11-27 06:17:10.156603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.696 [2024-11-27 06:17:10.156618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.696 #44 NEW cov: 11911 ft: 15376 corp: 23/1484b lim: 120 exec/s: 44 rss: 69Mb L: 58/119 MS: 1 ShuffleBytes- 00:07:40.696 [2024-11-27 06:17:10.196590] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.696 [2024-11-27 06:17:10.196623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.696 [2024-11-27 06:17:10.196661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.696 [2024-11-27 06:17:10.196677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.696 #45 NEW cov: 11911 ft: 15398 corp: 24/1541b lim: 120 exec/s: 45 rss: 70Mb L: 57/119 MS: 1 EraseBytes- 00:07:40.956 [2024-11-27 06:17:10.236763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.236791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.236842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.236858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.956 #46 NEW cov: 11911 ft: 15450 corp: 25/1596b lim: 120 exec/s: 46 rss: 70Mb L: 55/119 MS: 1 CopyPart- 00:07:40.956 [2024-11-27 06:17:10.276992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.277020] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.277066] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.277082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.277145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.277160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.956 #47 NEW cov: 11911 ft: 15502 corp: 26/1674b lim: 120 exec/s: 47 rss: 70Mb L: 78/119 MS: 1 CopyPart- 00:07:40.956 [2024-11-27 06:17:10.327255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.327283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.327335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.327351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.327404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.327422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.327474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.327490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.956 #48 NEW cov: 11911 ft: 15525 corp: 27/1773b lim: 120 exec/s: 48 rss: 70Mb L: 99/119 MS: 1 CMP- DE: "\003\000\000\000\000\000\000\000"- 00:07:40.956 [2024-11-27 06:17:10.367226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2641 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.367255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.367320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.367337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.367390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 
06:17:10.367405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.956 #50 NEW cov: 11911 ft: 15547 corp: 28/1854b lim: 120 exec/s: 50 rss: 70Mb L: 81/119 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:40.956 [2024-11-27 06:17:10.407487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.407514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.407550] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.407565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.407636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.407651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.407703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.407719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.956 #51 NEW cov: 11911 ft: 15580 corp: 29/1973b lim: 120 exec/s: 51 rss: 70Mb L: 119/119 MS: 1 CopyPart- 00:07:40.956 [2024-11-27 06:17:10.447327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.447356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.447404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728212994570 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.447421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.956 #52 NEW cov: 11911 ft: 15589 corp: 30/2031b lim: 120 exec/s: 52 rss: 70Mb L: 58/119 MS: 1 PersAutoDict- DE: "\003\000\000\000\000\000\000\000"- 00:07:40.956 [2024-11-27 06:17:10.487704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401732507303935 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.487732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.487777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.487793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.956 
[2024-11-27 06:17:10.487846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.487861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.956 [2024-11-27 06:17:10.487913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.956 [2024-11-27 06:17:10.487928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.216 #53 NEW cov: 11911 ft: 15590 corp: 31/2130b lim: 120 exec/s: 53 rss: 70Mb L: 99/119 MS: 1 CrossOver- 00:07:41.216 [2024-11-27 06:17:10.527723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12804210591668482481 len:45490 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.216 [2024-11-27 06:17:10.527751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.216 [2024-11-27 06:17:10.527788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12804210592339571121 len:45490 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.216 [2024-11-27 06:17:10.527804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.216 [2024-11-27 06:17:10.527856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12804210592339571121 len:45490 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.216 [2024-11-27 06:17:10.527870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.216 #57 NEW cov: 11911 ft: 15593 corp: 32/2207b lim: 120 exec/s: 57 rss: 70Mb L: 77/119 MS: 4 CrossOver-ChangeBinInt-ChangeByte-InsertRepeatedBytes- 00:07:41.216 [2024-11-27 06:17:10.567645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.216 [2024-11-27 06:17:10.567673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.216 [2024-11-27 06:17:10.567721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.216 [2024-11-27 06:17:10.567735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.216 #58 NEW cov: 11911 ft: 15611 corp: 33/2255b lim: 120 exec/s: 58 rss: 70Mb L: 48/119 MS: 1 InsertByte- 00:07:41.216 [2024-11-27 06:17:10.608067] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.216 [2024-11-27 06:17:10.608097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.216 [2024-11-27 06:17:10.608133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
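Each command/completion pair of NOTICE lines in this run is one mutated WRITE submitted by the fuzzer and rejected by the target with INVALID NAMESPACE OR FORMAT (status code type 00, status code 0b), which is expected since the mutated commands carry nsid:0. As a rough standalone C sketch of what the two log lines encode — the struct layout and print helpers are simplified stand-ins for illustration, not SPDK's actual nvme_qpair.c code — (the run output continues below):

```c
#include <stdint.h>
#include <stdio.h>

/* Simplified stand-ins for the fields visible in the NOTICE lines. */
struct io_cmd {
    uint16_t sqid;  /* submission queue id */
    uint16_t cid;   /* command id */
    uint32_t nsid;  /* namespace id (0 here, hence the error) */
    uint64_t lba;   /* starting logical block address */
    uint32_t len;   /* transfer length in blocks */
};

struct io_cpl {
    uint8_t  sct;   /* status code type: 0x00 = generic */
    uint8_t  sc;    /* status code: 0x0b = invalid namespace or format */
    uint16_t sqhd;  /* submission queue head pointer */
    uint32_t cdw0;
};

static void print_command(const struct io_cmd *c)
{
    printf("*NOTICE*: WRITE sqid:%u cid:%u nsid:%u lba:%llu len:%u\n",
           (unsigned)c->sqid, (unsigned)c->cid, c->nsid,
           (unsigned long long)c->lba, c->len);
}

static void print_completion(uint16_t qid, uint16_t cid, const struct io_cpl *p)
{
    printf("*NOTICE*: INVALID NAMESPACE OR FORMAT (%02x/%02x) "
           "qid:%u cid:%u cdw0:%u sqhd:%04x\n",
           (unsigned)p->sct, (unsigned)p->sc, (unsigned)qid, (unsigned)cid,
           p->cdw0, (unsigned)p->sqhd);
}

int main(void)
{
    /* Values taken from one pair in the log above. */
    struct io_cmd cmd = { 1, 0, 0, 723401728380766730ULL, 2571 };
    struct io_cpl cpl = { 0x00, 0x0b, 0x0002, 0 };
    print_command(&cmd);
    print_completion(1, cmd.cid, &cpl);
    return 0;
}
```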
00:07:41.216 [2024-11-27 06:17:10.608151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.216 [2024-11-27 06:17:10.608202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.217 [2024-11-27 06:17:10.608218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.217 [2024-11-27 06:17:10.608269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.217 [2024-11-27 06:17:10.608284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.217 #59 NEW cov: 11911 ft: 15640 corp: 34/2354b lim: 120 exec/s: 59 rss: 70Mb L: 99/119 MS: 1 ChangeByte- 00:07:41.217 [2024-11-27 06:17:10.648176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.217 [2024-11-27 06:17:10.648206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.217 [2024-11-27 06:17:10.648242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.217 [2024-11-27 06:17:10.648258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.217 [2024-11-27 06:17:10.648309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073692839935 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.217 [2024-11-27 06:17:10.648325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.217 [2024-11-27 06:17:10.648379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.217 [2024-11-27 06:17:10.648393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.217 #60 NEW cov: 11911 ft: 15701 corp: 35/2453b lim: 120 exec/s: 60 rss: 70Mb L: 99/119 MS: 1 ShuffleBytes- 00:07:41.217 [2024-11-27 06:17:10.688014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.217 [2024-11-27 06:17:10.688044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.217 [2024-11-27 06:17:10.688094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.217 [2024-11-27 06:17:10.688110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.217 #61 NEW cov: 11911 ft: 15715 corp: 36/2514b lim: 120 exec/s: 61 rss: 70Mb L: 61/119 MS: 1 CopyPart- 00:07:41.217 [2024-11-27 06:17:10.728417] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.217 [2024-11-27 06:17:10.728446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.217 [2024-11-27 06:17:10.728488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744070219890687 len:65321 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.217 [2024-11-27 06:17:10.728504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.217 [2024-11-27 06:17:10.728557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.217 [2024-11-27 06:17:10.728577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.217 [2024-11-27 06:17:10.728633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.217 [2024-11-27 06:17:10.728649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.217 #62 NEW cov: 11911 ft: 15735 corp: 37/2615b lim: 120 exec/s: 62 rss: 70Mb L: 101/119 MS: 1 InsertByte- 00:07:41.476 [2024-11-27 06:17:10.768533] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.476 [2024-11-27 06:17:10.768561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.476 [2024-11-27 06:17:10.768603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709501183 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.476 [2024-11-27 06:17:10.768619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.476 [2024-11-27 06:17:10.768671] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744069431296255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.476 [2024-11-27 06:17:10.768687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.476 [2024-11-27 06:17:10.768740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.476 [2024-11-27 06:17:10.768755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.476 #63 NEW cov: 11911 ft: 15745 corp: 38/2715b lim: 120 exec/s: 63 rss: 70Mb L: 100/119 MS: 1 InsertByte- 00:07:41.476 [2024-11-27 06:17:10.808724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.476 [2024-11-27 06:17:10.808752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.476 
[2024-11-27 06:17:10.808794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.476 [2024-11-27 06:17:10.808808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.476 [2024-11-27 06:17:10.808861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.476 [2024-11-27 06:17:10.808876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.476 [2024-11-27 06:17:10.808930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.476 [2024-11-27 06:17:10.808944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.476 #64 NEW cov: 11911 ft: 15774 corp: 39/2814b lim: 120 exec/s: 64 rss: 70Mb L: 99/119 MS: 1 CopyPart- 00:07:41.477 [2024-11-27 06:17:10.848779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.477 [2024-11-27 06:17:10.848807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.477 [2024-11-27 06:17:10.848844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.477 [2024-11-27 06:17:10.848863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.477 [2024-11-27 06:17:10.848914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.477 [2024-11-27 06:17:10.848932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.477 [2024-11-27 06:17:10.848984] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.477 [2024-11-27 06:17:10.848999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.477 #65 NEW cov: 11911 ft: 15776 corp: 40/2932b lim: 120 exec/s: 65 rss: 70Mb L: 118/119 MS: 1 CrossOver- 00:07:41.477 [2024-11-27 06:17:10.888404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.477 [2024-11-27 06:17:10.888432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.477 #66 NEW cov: 11911 ft: 15811 corp: 41/2976b lim: 120 exec/s: 66 rss: 70Mb L: 44/119 MS: 1 CrossOver- 00:07:41.477 [2024-11-27 06:17:10.928554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.477 [2024-11-27 06:17:10.928581] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.477 #67 NEW cov: 11911 ft: 15859 corp: 42/3012b lim: 120 exec/s: 67 rss: 70Mb L: 36/119 MS: 1 EraseBytes- 00:07:41.477 [2024-11-27 06:17:10.968815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.477 [2024-11-27 06:17:10.968843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.477 [2024-11-27 06:17:10.968892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.477 [2024-11-27 06:17:10.968909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.477 [2024-11-27 06:17:10.998893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.477 [2024-11-27 06:17:10.998921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.477 [2024-11-27 06:17:10.998952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.477 [2024-11-27 06:17:10.998968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.737 #69 NEW cov: 11911 ft: 15888 corp: 43/3070b lim: 120 exec/s: 34 rss: 70Mb L: 58/119 MS: 2 ChangeBinInt-ShuffleBytes- 00:07:41.737 #69 DONE cov: 11911 ft: 15888 corp: 43/3070b lim: 120 exec/s: 34 rss: 70Mb 00:07:41.737 ###### Recommended dictionary. ###### 00:07:41.737 "\003\000\000\000\000\000\000\000" # Uses: 1 00:07:41.737 ###### End of recommended dictionary. 
###### 00:07:41.737 Done 69 runs in 2 second(s) 00:07:41.737 06:17:11 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:07:41.737 06:17:11 -- ../common.sh@72 -- # (( i++ )) 00:07:41.737 06:17:11 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:41.737 06:17:11 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:07:41.737 06:17:11 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:07:41.737 06:17:11 -- nvmf/run.sh@24 -- # local timen=1 00:07:41.737 06:17:11 -- nvmf/run.sh@25 -- # local core=0x1 00:07:41.737 06:17:11 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:41.737 06:17:11 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:07:41.737 06:17:11 -- nvmf/run.sh@29 -- # printf %02d 18 00:07:41.737 06:17:11 -- nvmf/run.sh@29 -- # port=4418 00:07:41.737 06:17:11 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:41.737 06:17:11 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:07:41.737 06:17:11 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:41.737 06:17:11 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:07:41.737 [2024-11-27 06:17:11.158857] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:41.737 [2024-11-27 06:17:11.158914] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid37083 ] 00:07:41.737 EAL: No free 2048 kB hugepages reported on node 1 00:07:41.997 [2024-11-27 06:17:11.339579] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.997 [2024-11-27 06:17:11.404043] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:41.997 [2024-11-27 06:17:11.404187] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.997 [2024-11-27 06:17:11.462139] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:41.997 [2024-11-27 06:17:11.478514] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:07:41.997 INFO: Running with entropic power schedule (0xFF, 100). 00:07:41.997 INFO: Seed: 4293475784 00:07:41.997 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:41.997 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:41.997 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:41.997 INFO: A corpus is not provided, starting from an empty corpus 00:07:41.997 #2 INITED exec/s: 0 rss: 60Mb 00:07:41.997 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:41.997 This may also happen if the target rejected all inputs we tried so far 00:07:41.997 [2024-11-27 06:17:11.523471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:41.997 [2024-11-27 06:17:11.523500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.516 NEW_FUNC[1/670]: 0x457e58 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:07:42.516 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:42.516 #17 NEW cov: 11628 ft: 11629 corp: 2/30b lim: 100 exec/s: 0 rss: 68Mb L: 29/29 MS: 5 ShuffleBytes-ShuffleBytes-InsertByte-CrossOver-InsertRepeatedBytes- 00:07:42.516 [2024-11-27 06:17:11.824229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:42.516 [2024-11-27 06:17:11.824258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.516 #18 NEW cov: 11741 ft: 12193 corp: 3/53b lim: 100 exec/s: 0 rss: 68Mb L: 23/29 MS: 1 EraseBytes- 00:07:42.516 [2024-11-27 06:17:11.864501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:42.516 [2024-11-27 06:17:11.864527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.516 [2024-11-27 06:17:11.864567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:42.516 [2024-11-27 06:17:11.864581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.516 [2024-11-27 06:17:11.864641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:42.516 [2024-11-27 06:17:11.864655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.516 #19 NEW cov: 11747 ft: 12799 corp: 4/122b lim: 100 exec/s: 0 rss: 68Mb L: 69/69 MS: 1 InsertRepeatedBytes- 00:07:42.516 [2024-11-27 06:17:11.904620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:42.516 [2024-11-27 06:17:11.904648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.516 [2024-11-27 06:17:11.904699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:42.516 [2024-11-27 06:17:11.904712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.516 [2024-11-27 06:17:11.904773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:42.516 [2024-11-27 06:17:11.904786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.516 #20 NEW cov: 11832 ft: 13148 corp: 5/188b lim: 100 exec/s: 0 rss: 68Mb L: 66/69 MS: 1 EraseBytes- 00:07:42.516 [2024-11-27 06:17:11.944519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) 
sqid:1 cid:0 nsid:0 00:07:42.516 [2024-11-27 06:17:11.944544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.516 #24 NEW cov: 11832 ft: 13348 corp: 6/222b lim: 100 exec/s: 0 rss: 68Mb L: 34/69 MS: 4 ShuffleBytes-CopyPart-CMP-InsertRepeatedBytes- DE: "\001\004\000\000\000\000\000\000"- 00:07:42.516 [2024-11-27 06:17:11.984857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:42.516 [2024-11-27 06:17:11.984883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.516 [2024-11-27 06:17:11.984918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:42.516 [2024-11-27 06:17:11.984932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.516 [2024-11-27 06:17:11.984984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:42.516 [2024-11-27 06:17:11.984998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.516 #25 NEW cov: 11832 ft: 13423 corp: 7/291b lim: 100 exec/s: 0 rss: 68Mb L: 69/69 MS: 1 CopyPart- 00:07:42.516 [2024-11-27 06:17:12.024723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:42.516 [2024-11-27 06:17:12.024749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.516 #26 NEW cov: 11832 ft: 13526 corp: 8/328b lim: 100 exec/s: 0 rss: 68Mb L: 37/69 MS: 1 PersAutoDict- DE: "\001\004\000\000\000\000\000\000"- 00:07:42.776 [2024-11-27 06:17:12.065072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:42.776 [2024-11-27 06:17:12.065098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.776 [2024-11-27 06:17:12.065143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:42.776 [2024-11-27 06:17:12.065157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.776 [2024-11-27 06:17:12.065209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:42.776 [2024-11-27 06:17:12.065229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.776 #27 NEW cov: 11832 ft: 13616 corp: 9/398b lim: 100 exec/s: 0 rss: 68Mb L: 70/70 MS: 1 CrossOver- 00:07:42.776 [2024-11-27 06:17:12.105213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:42.776 [2024-11-27 06:17:12.105239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.776 [2024-11-27 06:17:12.105288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:42.776 [2024-11-27 06:17:12.105303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.776 [2024-11-27 06:17:12.105356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:42.776 [2024-11-27 06:17:12.105370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.776 #28 NEW cov: 11832 ft: 13695 corp: 10/467b lim: 100 exec/s: 0 rss: 69Mb L: 69/70 MS: 1 ChangeByte- 00:07:42.776 [2024-11-27 06:17:12.145293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:42.776 [2024-11-27 06:17:12.145319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.776 [2024-11-27 06:17:12.145355] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:42.776 [2024-11-27 06:17:12.145369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.776 [2024-11-27 06:17:12.145420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:42.776 [2024-11-27 06:17:12.145434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.776 #29 NEW cov: 11832 ft: 13809 corp: 11/533b lim: 100 exec/s: 0 rss: 69Mb L: 66/70 MS: 1 ChangeBinInt- 00:07:42.776 [2024-11-27 06:17:12.185216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:42.776 [2024-11-27 06:17:12.185243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.776 #30 NEW cov: 11832 ft: 13841 corp: 12/570b lim: 100 exec/s: 0 rss: 69Mb L: 37/70 MS: 1 ChangeBit- 00:07:42.776 [2024-11-27 06:17:12.225339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:42.776 [2024-11-27 06:17:12.225365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.776 #31 NEW cov: 11832 ft: 13887 corp: 13/604b lim: 100 exec/s: 0 rss: 69Mb L: 34/70 MS: 1 ChangeBit- 00:07:42.776 [2024-11-27 06:17:12.265818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:42.776 [2024-11-27 06:17:12.265845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.776 [2024-11-27 06:17:12.265902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:42.776 [2024-11-27 06:17:12.265918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.776 [2024-11-27 06:17:12.265968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:42.776 [2024-11-27 06:17:12.265983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.776 [2024-11-27 06:17:12.266038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:42.776 [2024-11-27 06:17:12.266052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.776 #32 NEW cov: 11832 ft: 14192 corp: 14/684b lim: 100 exec/s: 0 rss: 69Mb L: 80/80 MS: 1 InsertRepeatedBytes- 00:07:43.036 [2024-11-27 06:17:12.315867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.036 [2024-11-27 06:17:12.315893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.036 [2024-11-27 06:17:12.315944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.036 [2024-11-27 06:17:12.315958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.036 [2024-11-27 06:17:12.316011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.036 [2024-11-27 06:17:12.316025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.036 #33 NEW cov: 11832 ft: 14215 corp: 15/753b lim: 100 exec/s: 0 rss: 69Mb L: 69/80 MS: 1 ShuffleBytes- 00:07:43.036 [2024-11-27 06:17:12.355933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.036 [2024-11-27 06:17:12.355959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.036 [2024-11-27 06:17:12.355992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.036 [2024-11-27 06:17:12.356005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.036 [2024-11-27 06:17:12.356056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.036 [2024-11-27 06:17:12.356069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.036 #34 NEW cov: 11832 ft: 14232 corp: 16/822b lim: 100 exec/s: 0 rss: 69Mb L: 69/80 MS: 1 ChangeBinInt- 00:07:43.036 [2024-11-27 06:17:12.395802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.036 [2024-11-27 06:17:12.395827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.036 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:43.036 #35 NEW cov: 11855 ft: 14295 corp: 17/856b lim: 100 exec/s: 0 rss: 69Mb L: 34/80 MS: 1 ShuffleBytes- 00:07:43.036 [2024-11-27 06:17:12.436172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.036 [2024-11-27 06:17:12.436199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.036 [2024-11-27 06:17:12.436234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.036 [2024-11-27 06:17:12.436248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.036 [2024-11-27 06:17:12.436298] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.036 [2024-11-27 06:17:12.436312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.036 #36 NEW cov: 11855 ft: 14311 corp: 18/922b lim: 100 exec/s: 0 rss: 69Mb L: 66/80 MS: 1 ChangeByte- 00:07:43.036 [2024-11-27 06:17:12.476304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.036 [2024-11-27 06:17:12.476330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.036 [2024-11-27 06:17:12.476364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.036 [2024-11-27 06:17:12.476378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.036 [2024-11-27 06:17:12.476434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.036 [2024-11-27 06:17:12.476449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.036 #37 NEW cov: 11855 ft: 14324 corp: 19/992b lim: 100 exec/s: 0 rss: 69Mb L: 70/80 MS: 1 InsertByte- 00:07:43.036 [2024-11-27 06:17:12.516414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.036 [2024-11-27 06:17:12.516441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.036 [2024-11-27 06:17:12.516495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.036 [2024-11-27 06:17:12.516511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.036 [2024-11-27 06:17:12.516565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.036 [2024-11-27 06:17:12.516580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.036 #38 NEW cov: 11855 ft: 14333 corp: 20/1062b lim: 100 exec/s: 38 rss: 69Mb L: 70/80 MS: 1 InsertRepeatedBytes- 00:07:43.036 [2024-11-27 06:17:12.556297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.036 [2024-11-27 06:17:12.556324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.297 #39 NEW cov: 11855 ft: 14354 corp: 21/1096b lim: 100 exec/s: 39 rss: 69Mb L: 34/80 MS: 1 ChangeBit- 00:07:43.297 [2024-11-27 06:17:12.596654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.297 [2024-11-27 06:17:12.596680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.297 [2024-11-27 06:17:12.596742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.297 [2024-11-27 06:17:12.596757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
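The "#N NEW cov" lines in this run come from libFuzzer driving fuzz_nvm_write_zeroes_command (NVMe opcode 08) through the target. The sketch below shows the general shape of such a harness using the standard LLVMFuzzerTestOneInput entry point; the byte layout and the do_write_zeroes stub are assumptions made for this example only — SPDK's real target in llvm_nvme_fuzz.c submits the command over an actual NVMe-oF TCP connection:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Toy target for the sketch: fail nsid 0, mirroring the log's
 * INVALID NAMESPACE OR FORMAT completions. Not SPDK code. */
static int do_write_zeroes(uint32_t nsid, uint64_t lba, uint16_t nlb)
{
    if (nsid == 0)
        return -1;          /* invalid namespace or format */
    (void)lba; (void)nlb;
    return 0;
}

int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
    uint8_t  opc;
    uint32_t nsid;
    uint64_t lba;
    uint16_t nlb;

    /* Assumed input layout: opcode byte, then nsid, lba, block count. */
    if (size < 1 + sizeof(nsid) + sizeof(lba) + sizeof(nlb))
        return 0;           /* too short: reject the input */

    opc = data[0];
    memcpy(&nsid, data + 1, sizeof(nsid));
    memcpy(&lba,  data + 1 + sizeof(nsid), sizeof(lba));
    memcpy(&nlb,  data + 1 + sizeof(nsid) + sizeof(lba), sizeof(nlb));

    if (opc == 0x08)        /* WRITE ZEROES, shown as (08) in the log */
        do_write_zeroes(nsid, lba, nlb);
    return 0;
}
```

Built with something like `clang -fsanitize=fuzzer sketch.c`, libFuzzer supplies main(), calls this function once per mutated input, and prints a "NEW cov" line whenever an input reaches new coverage.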
00:07:43.297 [2024-11-27 06:17:12.596807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.297 [2024-11-27 06:17:12.596822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.297 #40 NEW cov: 11855 ft: 14381 corp: 22/1164b lim: 100 exec/s: 40 rss: 69Mb L: 68/80 MS: 1 CrossOver- 00:07:43.297 [2024-11-27 06:17:12.636765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.297 [2024-11-27 06:17:12.636791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.297 [2024-11-27 06:17:12.636838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.297 [2024-11-27 06:17:12.636851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.297 [2024-11-27 06:17:12.636904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.297 [2024-11-27 06:17:12.636933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.297 #41 NEW cov: 11855 ft: 14385 corp: 23/1233b lim: 100 exec/s: 41 rss: 69Mb L: 69/80 MS: 1 ChangeBit- 00:07:43.297 [2024-11-27 06:17:12.676915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.297 [2024-11-27 06:17:12.676941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.297 [2024-11-27 06:17:12.676977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.297 [2024-11-27 06:17:12.676994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.297 [2024-11-27 06:17:12.677045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.297 [2024-11-27 06:17:12.677059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.297 #42 NEW cov: 11855 ft: 14403 corp: 24/1303b lim: 100 exec/s: 42 rss: 69Mb L: 70/80 MS: 1 ChangeBit- 00:07:43.297 [2024-11-27 06:17:12.716995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.297 [2024-11-27 06:17:12.717020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.297 [2024-11-27 06:17:12.717066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.297 [2024-11-27 06:17:12.717081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.297 [2024-11-27 06:17:12.717129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.297 [2024-11-27 06:17:12.717143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.297 #43 NEW cov: 11855 ft: 14420 corp: 25/1372b lim: 100 exec/s: 
43 rss: 70Mb L: 69/80 MS: 1 CopyPart- 00:07:43.297 [2024-11-27 06:17:12.757105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.297 [2024-11-27 06:17:12.757131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.297 [2024-11-27 06:17:12.757177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.297 [2024-11-27 06:17:12.757191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.297 [2024-11-27 06:17:12.757242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.297 [2024-11-27 06:17:12.757257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.297 #44 NEW cov: 11855 ft: 14444 corp: 26/1441b lim: 100 exec/s: 44 rss: 70Mb L: 69/80 MS: 1 ShuffleBytes- 00:07:43.297 [2024-11-27 06:17:12.797354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.297 [2024-11-27 06:17:12.797379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.297 [2024-11-27 06:17:12.797430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.297 [2024-11-27 06:17:12.797442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.297 [2024-11-27 06:17:12.797493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.297 [2024-11-27 06:17:12.797506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.297 [2024-11-27 06:17:12.797560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:43.297 [2024-11-27 06:17:12.797572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:43.297 #45 NEW cov: 11855 ft: 14449 corp: 27/1532b lim: 100 exec/s: 45 rss: 70Mb L: 91/91 MS: 1 InsertRepeatedBytes- 00:07:43.557 [2024-11-27 06:17:12.837324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.557 [2024-11-27 06:17:12.837349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.557 [2024-11-27 06:17:12.837410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.557 [2024-11-27 06:17:12.837424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.557 [2024-11-27 06:17:12.837475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.557 [2024-11-27 06:17:12.837489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.557 #46 NEW cov: 11855 ft: 14453 corp: 28/1603b lim: 100 exec/s: 46 rss: 70Mb L: 71/91 MS: 1 InsertByte- 00:07:43.557 
[2024-11-27 06:17:12.877190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.557 [2024-11-27 06:17:12.877215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.557 #47 NEW cov: 11855 ft: 14465 corp: 29/1637b lim: 100 exec/s: 47 rss: 70Mb L: 34/91 MS: 1 ShuffleBytes- 00:07:43.557 [2024-11-27 06:17:12.917556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.557 [2024-11-27 06:17:12.917582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.557 [2024-11-27 06:17:12.917630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.557 [2024-11-27 06:17:12.917644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.557 [2024-11-27 06:17:12.917696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.557 [2024-11-27 06:17:12.917725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.557 #48 NEW cov: 11855 ft: 14493 corp: 30/1707b lim: 100 exec/s: 48 rss: 70Mb L: 70/91 MS: 1 ChangeBinInt- 00:07:43.557 [2024-11-27 06:17:12.957431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.557 [2024-11-27 06:17:12.957456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.557 #49 NEW cov: 11855 ft: 14508 corp: 31/1727b lim: 100 exec/s: 49 rss: 70Mb L: 20/91 MS: 1 EraseBytes- 00:07:43.557 [2024-11-27 06:17:12.997678] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.557 [2024-11-27 06:17:12.997703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.557 [2024-11-27 06:17:12.997739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.557 [2024-11-27 06:17:12.997753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.557 #50 NEW cov: 11855 ft: 14805 corp: 32/1777b lim: 100 exec/s: 50 rss: 70Mb L: 50/91 MS: 1 InsertRepeatedBytes- 00:07:43.557 [2024-11-27 06:17:13.037636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.557 [2024-11-27 06:17:13.037661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.557 #51 NEW cov: 11855 ft: 14810 corp: 33/1798b lim: 100 exec/s: 51 rss: 70Mb L: 21/91 MS: 1 InsertByte- 00:07:43.557 [2024-11-27 06:17:13.077779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.557 [2024-11-27 06:17:13.077804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.817 #52 NEW cov: 11855 ft: 14841 corp: 34/1821b lim: 100 exec/s: 52 rss: 70Mb L: 23/91 MS: 1 ShuffleBytes- 00:07:43.817 [2024-11-27 06:17:13.118242] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.817 [2024-11-27 06:17:13.118266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.817 [2024-11-27 06:17:13.118306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.817 [2024-11-27 06:17:13.118321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.817 [2024-11-27 06:17:13.118373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.817 [2024-11-27 06:17:13.118387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.817 [2024-11-27 06:17:13.118437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:43.817 [2024-11-27 06:17:13.118451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:43.817 #53 NEW cov: 11855 ft: 14857 corp: 35/1912b lim: 100 exec/s: 53 rss: 70Mb L: 91/91 MS: 1 CopyPart- 00:07:43.817 [2024-11-27 06:17:13.158364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.817 [2024-11-27 06:17:13.158389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.817 [2024-11-27 06:17:13.158439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.817 [2024-11-27 06:17:13.158451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.817 [2024-11-27 06:17:13.158502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.817 [2024-11-27 06:17:13.158515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.817 [2024-11-27 06:17:13.158567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:43.817 [2024-11-27 06:17:13.158579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:43.817 #54 NEW cov: 11855 ft: 14863 corp: 36/2011b lim: 100 exec/s: 54 rss: 70Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:07:43.817 [2024-11-27 06:17:13.198436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.817 [2024-11-27 06:17:13.198461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.817 [2024-11-27 06:17:13.198509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.817 [2024-11-27 06:17:13.198524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.817 [2024-11-27 06:17:13.198574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.817 [2024-11-27 06:17:13.198607] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.817 [2024-11-27 06:17:13.198663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:43.817 [2024-11-27 06:17:13.198677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:43.817 #55 NEW cov: 11855 ft: 14931 corp: 37/2105b lim: 100 exec/s: 55 rss: 70Mb L: 94/99 MS: 1 InsertRepeatedBytes- 00:07:43.817 [2024-11-27 06:17:13.238458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.817 [2024-11-27 06:17:13.238483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.817 [2024-11-27 06:17:13.238524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.817 [2024-11-27 06:17:13.238538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.817 [2024-11-27 06:17:13.238592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.817 [2024-11-27 06:17:13.238612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.817 #56 NEW cov: 11855 ft: 14944 corp: 38/2175b lim: 100 exec/s: 56 rss: 70Mb L: 70/99 MS: 1 ChangeBinInt- 00:07:43.817 [2024-11-27 06:17:13.278604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.817 [2024-11-27 06:17:13.278629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.817 [2024-11-27 06:17:13.278677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:43.817 [2024-11-27 06:17:13.278691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:43.817 [2024-11-27 06:17:13.278742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:43.817 [2024-11-27 06:17:13.278756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:43.817 #57 NEW cov: 11855 ft: 14950 corp: 39/2243b lim: 100 exec/s: 57 rss: 70Mb L: 68/99 MS: 1 CMP- DE: "\231\001\000\000"- 00:07:43.817 [2024-11-27 06:17:13.318469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:43.817 [2024-11-27 06:17:13.318494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:43.817 #58 NEW cov: 11855 ft: 14973 corp: 40/2263b lim: 100 exec/s: 58 rss: 70Mb L: 20/99 MS: 1 EraseBytes- 00:07:44.077 [2024-11-27 06:17:13.358573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:44.077 [2024-11-27 06:17:13.358601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.077 #59 NEW cov: 11855 ft: 15045 corp: 41/2291b lim: 100 exec/s: 59 rss: 70Mb L: 28/99 MS: 1 CMP- DE: 
"\004\000\000\000\000\000\000\000"- 00:07:44.077 [2024-11-27 06:17:13.399054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:44.077 [2024-11-27 06:17:13.399080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.077 [2024-11-27 06:17:13.399119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:44.077 [2024-11-27 06:17:13.399133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:44.077 [2024-11-27 06:17:13.399186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:44.077 [2024-11-27 06:17:13.399199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:44.077 [2024-11-27 06:17:13.399251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:44.077 [2024-11-27 06:17:13.399265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:44.077 #60 NEW cov: 11855 ft: 15058 corp: 42/2376b lim: 100 exec/s: 60 rss: 70Mb L: 85/99 MS: 1 CopyPart- 00:07:44.077 [2024-11-27 06:17:13.439255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:44.077 [2024-11-27 06:17:13.439281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.077 [2024-11-27 06:17:13.439334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:44.077 [2024-11-27 06:17:13.439347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:44.077 [2024-11-27 06:17:13.439402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:44.077 [2024-11-27 06:17:13.439415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:44.077 [2024-11-27 06:17:13.439465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:44.077 [2024-11-27 06:17:13.439479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:44.077 [2024-11-27 06:17:13.439531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:07:44.077 [2024-11-27 06:17:13.439545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:44.077 #61 NEW cov: 11855 ft: 15108 corp: 43/2476b lim: 100 exec/s: 61 rss: 70Mb L: 100/100 MS: 1 InsertByte- 00:07:44.077 [2024-11-27 06:17:13.479294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:44.077 [2024-11-27 06:17:13.479320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.077 [2024-11-27 06:17:13.479360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:44.077 
[2024-11-27 06:17:13.479375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:44.077 [2024-11-27 06:17:13.479429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:44.077 [2024-11-27 06:17:13.479459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:44.077 [2024-11-27 06:17:13.479513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:44.077 [2024-11-27 06:17:13.479527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:44.077 #62 NEW cov: 11855 ft: 15115 corp: 44/2570b lim: 100 exec/s: 62 rss: 70Mb L: 94/100 MS: 1 CopyPart- 00:07:44.077 [2024-11-27 06:17:13.519294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:44.077 [2024-11-27 06:17:13.519320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.077 [2024-11-27 06:17:13.519356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:44.077 [2024-11-27 06:17:13.519371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:44.077 [2024-11-27 06:17:13.519422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:44.077 [2024-11-27 06:17:13.519436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:44.077 #63 NEW cov: 11855 ft: 15120 corp: 45/2640b lim: 100 exec/s: 31 rss: 70Mb L: 70/100 MS: 1 ChangeByte- 00:07:44.077 #63 DONE cov: 11855 ft: 15120 corp: 45/2640b lim: 100 exec/s: 31 rss: 70Mb 00:07:44.077 ###### Recommended dictionary. ###### 00:07:44.077 "\001\004\000\000\000\000\000\000" # Uses: 1 00:07:44.077 "\231\001\000\000" # Uses: 0 00:07:44.077 "\004\000\000\000\000\000\000\000" # Uses: 0 00:07:44.077 ###### End of recommended dictionary. 
###### 00:07:44.077 Done 63 runs in 2 second(s) 00:07:44.337 06:17:13 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:07:44.337 06:17:13 -- ../common.sh@72 -- # (( i++ )) 00:07:44.337 06:17:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:44.337 06:17:13 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:07:44.337 06:17:13 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:07:44.337 06:17:13 -- nvmf/run.sh@24 -- # local timen=1 00:07:44.337 06:17:13 -- nvmf/run.sh@25 -- # local core=0x1 00:07:44.337 06:17:13 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:44.337 06:17:13 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:07:44.337 06:17:13 -- nvmf/run.sh@29 -- # printf %02d 19 00:07:44.337 06:17:13 -- nvmf/run.sh@29 -- # port=4419 00:07:44.337 06:17:13 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:44.337 06:17:13 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:07:44.337 06:17:13 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:44.337 06:17:13 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:07:44.337 [2024-11-27 06:17:13.706616] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:44.337 [2024-11-27 06:17:13.706686] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid37414 ] 00:07:44.337 EAL: No free 2048 kB hugepages reported on node 1 00:07:44.597 [2024-11-27 06:17:13.880796] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.597 [2024-11-27 06:17:13.946430] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:44.597 [2024-11-27 06:17:13.946572] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.597 [2024-11-27 06:17:14.004578] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:44.597 [2024-11-27 06:17:14.020946] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:07:44.597 INFO: Running with entropic power schedule (0xFF, 100). 00:07:44.597 INFO: Seed: 2540511074 00:07:44.597 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:44.597 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:44.597 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:44.597 INFO: A corpus is not provided, starting from an empty corpus 00:07:44.597 #2 INITED exec/s: 0 rss: 60Mb 00:07:44.597 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
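The run-19 output below mixes two streams: standard libFuzzer progress lines (#N NEW/REDUCE/DONE ...) and SPDK *NOTICE* pairs in which nvme_io_qpair_print_command echoes each fuzzed command and spdk_nvme_print_completion echoes the target's reply. The recurring (00/0b) status decodes as NVMe status code type 0 (generic command status) / status code 0x0b (Invalid Namespace or Format), and the trailing sqhd/p/m/dnr fields are the completion entry's SQ head pointer, phase tag, more bit, and do-not-retry bit. As a reading aid, a minimal annotated sketch of the libFuzzer status fields — field meanings follow upstream libFuzzer conventions, not anything specific to this run; the sample line is the #60 entry from the run-18 output above:

# #60 NEW cov: 11855 ft: 15058 corp: 42/2376b lim: 100 exec/s: 60 rss: 70Mb L: 85/99 MS: 1 CopyPart-
#
# #60      total inputs executed when the event fired
# NEW      event type: a new interesting input joined the corpus
#          (REDUCE = a smaller equivalent was found; DONE = run finished)
# cov:     coverage points (edges/blocks) observed so far
# ft:      coverage features, a finer-grained signal than cov
# corp:    corpus size, as number of inputs / total bytes
# lim:     current input-length cap, ramped up toward -max_len
# exec/s:  executions per second
# rss:     resident memory of the fuzzer process
# L:       size of this input / largest input currently in the corpus
# MS:      number and names of the mutations that produced it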
00:07:44.597 This may also happen if the target rejected all inputs we tried so far 00:07:44.597 [2024-11-27 06:17:14.090562] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:44.597 [2024-11-27 06:17:14.090604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:44.857 NEW_FUNC[1/669]: 0x45ae18 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:07:44.857 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:44.857 #3 NEW cov: 11602 ft: 11603 corp: 2/17b lim: 50 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 InsertRepeatedBytes- 00:07:45.117 [2024-11-27 06:17:14.421413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:45.117 [2024-11-27 06:17:14.421468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.117 NEW_FUNC[1/1]: 0x170aae8 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1456 00:07:45.117 #4 NEW cov: 11719 ft: 12213 corp: 3/34b lim: 50 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 InsertByte- 00:07:45.117 [2024-11-27 06:17:14.471453] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65291 00:07:45.117 [2024-11-27 06:17:14.471480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.117 #5 NEW cov: 11725 ft: 12489 corp: 4/45b lim: 50 exec/s: 0 rss: 68Mb L: 11/17 MS: 1 EraseBytes- 00:07:45.117 [2024-11-27 06:17:14.512052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:45.117 [2024-11-27 06:17:14.512088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.117 [2024-11-27 06:17:14.512169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18377501233733697535 len:1 00:07:45.117 [2024-11-27 06:17:14.512189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.117 [2024-11-27 06:17:14.512318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:45.117 [2024-11-27 06:17:14.512339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.117 [2024-11-27 06:17:14.512450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:07:45.117 [2024-11-27 06:17:14.512473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:45.117 #6 NEW cov: 11810 ft: 13099 corp: 5/90b lim: 50 exec/s: 0 rss: 68Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:07:45.117 [2024-11-27 06:17:14.551752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 
cid:0 nsid:0 lba:18446744073709551615 len:65323 00:07:45.117 [2024-11-27 06:17:14.551779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.117 #12 NEW cov: 11810 ft: 13214 corp: 6/101b lim: 50 exec/s: 0 rss: 68Mb L: 11/45 MS: 1 ChangeBit- 00:07:45.117 [2024-11-27 06:17:14.592139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:45.117 [2024-11-27 06:17:14.592172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.117 [2024-11-27 06:17:14.592281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18377501233733697535 len:1 00:07:45.117 [2024-11-27 06:17:14.592302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.117 [2024-11-27 06:17:14.592411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:45.117 [2024-11-27 06:17:14.592435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.117 #13 NEW cov: 11810 ft: 13529 corp: 7/132b lim: 50 exec/s: 0 rss: 69Mb L: 31/45 MS: 1 EraseBytes- 00:07:45.117 [2024-11-27 06:17:14.632401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:8319119876378817395 len:29556 00:07:45.117 [2024-11-27 06:17:14.632431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.117 [2024-11-27 06:17:14.632506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8319119876378817395 len:29556 00:07:45.117 [2024-11-27 06:17:14.632527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.117 [2024-11-27 06:17:14.632643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8319119876378817395 len:29556 00:07:45.117 [2024-11-27 06:17:14.632667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.117 [2024-11-27 06:17:14.632781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:8319119876378817395 len:29556 00:07:45.117 [2024-11-27 06:17:14.632803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:45.377 #17 NEW cov: 11810 ft: 13639 corp: 8/178b lim: 50 exec/s: 0 rss: 69Mb L: 46/46 MS: 4 ShuffleBytes-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:07:45.377 [2024-11-27 06:17:14.672571] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:45.377 [2024-11-27 06:17:14.672603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.377 [2024-11-27 06:17:14.672717] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18377501233733697535 len:1 
00:07:45.377 [2024-11-27 06:17:14.672739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.377 [2024-11-27 06:17:14.672852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:45.377 [2024-11-27 06:17:14.672870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.377 [2024-11-27 06:17:14.672995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:35184372088832 len:1 00:07:45.377 [2024-11-27 06:17:14.673015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:45.377 #18 NEW cov: 11810 ft: 13657 corp: 9/223b lim: 50 exec/s: 0 rss: 69Mb L: 45/46 MS: 1 ChangeBit- 00:07:45.377 [2024-11-27 06:17:14.712152] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:4097 00:07:45.377 [2024-11-27 06:17:14.712178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.377 #19 NEW cov: 11810 ft: 13697 corp: 10/239b lim: 50 exec/s: 0 rss: 69Mb L: 16/46 MS: 1 ChangeBinInt- 00:07:45.377 [2024-11-27 06:17:14.752420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4294967295 len:1 00:07:45.377 [2024-11-27 06:17:14.752452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.377 [2024-11-27 06:17:14.752564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9007199254740992 len:1 00:07:45.377 [2024-11-27 06:17:14.752582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.377 #20 NEW cov: 11810 ft: 13970 corp: 11/263b lim: 50 exec/s: 0 rss: 69Mb L: 24/46 MS: 1 EraseBytes- 00:07:45.377 [2024-11-27 06:17:14.792968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:8319119876378817395 len:29655 00:07:45.377 [2024-11-27 06:17:14.792998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.377 [2024-11-27 06:17:14.793096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8319119876378817395 len:29556 00:07:45.377 [2024-11-27 06:17:14.793116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.377 [2024-11-27 06:17:14.793224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8319119876378817395 len:29556 00:07:45.378 [2024-11-27 06:17:14.793245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.378 [2024-11-27 06:17:14.793363] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:8319119876378817395 len:29556 00:07:45.378 [2024-11-27 06:17:14.793385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:45.378 #21 NEW cov: 11810 ft: 14016 corp: 12/309b lim: 50 exec/s: 0 rss: 69Mb L: 46/46 MS: 1 ChangeByte- 00:07:45.378 [2024-11-27 06:17:14.842592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:45.378 [2024-11-27 06:17:14.842639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.378 #22 NEW cov: 11810 ft: 14037 corp: 13/321b lim: 50 exec/s: 0 rss: 69Mb L: 12/46 MS: 1 InsertByte- 00:07:45.378 [2024-11-27 06:17:14.882847] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4294967040 len:1 00:07:45.378 [2024-11-27 06:17:14.882881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.378 [2024-11-27 06:17:14.883005] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744069431361535 len:10877 00:07:45.378 [2024-11-27 06:17:14.883022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.378 #23 NEW cov: 11810 ft: 14078 corp: 14/341b lim: 50 exec/s: 0 rss: 69Mb L: 20/46 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:45.637 [2024-11-27 06:17:14.922843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446743025737531391 len:65323 00:07:45.637 [2024-11-27 06:17:14.922877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.637 #24 NEW cov: 11810 ft: 14167 corp: 15/352b lim: 50 exec/s: 0 rss: 69Mb L: 11/46 MS: 1 ChangeBinInt- 00:07:45.637 [2024-11-27 06:17:14.962918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:72057598332895231 len:4097 00:07:45.637 [2024-11-27 06:17:14.962951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.637 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:45.637 #25 NEW cov: 11833 ft: 14283 corp: 16/368b lim: 50 exec/s: 0 rss: 69Mb L: 16/46 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:45.637 [2024-11-27 06:17:15.003041] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:72057598332895231 len:4343 00:07:45.637 [2024-11-27 06:17:15.003069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.637 #26 NEW cov: 11833 ft: 14307 corp: 17/384b lim: 50 exec/s: 0 rss: 69Mb L: 16/46 MS: 1 ChangeByte- 00:07:45.637 [2024-11-27 06:17:15.043196] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1099511627521 len:4097 00:07:45.637 [2024-11-27 06:17:15.043223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.637 #27 NEW cov: 11833 ft: 14317 corp: 18/400b lim: 50 exec/s: 0 rss: 69Mb L: 16/46 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:45.637 [2024-11-27 06:17:15.083683] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:45.638 [2024-11-27 06:17:15.083717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.638 [2024-11-27 06:17:15.083839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:72057594037927696 len:65536 00:07:45.638 [2024-11-27 06:17:15.083859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.638 [2024-11-27 06:17:15.083981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:72057594037927696 len:65536 00:07:45.638 [2024-11-27 06:17:15.084004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.638 #28 NEW cov: 11833 ft: 14398 corp: 19/431b lim: 50 exec/s: 28 rss: 69Mb L: 31/46 MS: 1 CopyPart- 00:07:45.638 [2024-11-27 06:17:15.123514] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374686483966590975 len:1 00:07:45.638 [2024-11-27 06:17:15.123547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.638 [2024-11-27 06:17:15.123668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446743506790645759 len:1 00:07:45.638 [2024-11-27 06:17:15.123684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.638 #30 NEW cov: 11833 ft: 14411 corp: 20/459b lim: 50 exec/s: 30 rss: 69Mb L: 28/46 MS: 2 EraseBytes-CrossOver- 00:07:45.638 [2024-11-27 06:17:15.164006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:72057598332895231 len:65536 00:07:45.638 [2024-11-27 06:17:15.164036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.638 [2024-11-27 06:17:15.164122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65291 00:07:45.638 [2024-11-27 06:17:15.164145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.638 [2024-11-27 06:17:15.164258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:45.638 [2024-11-27 06:17:15.164280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.638 [2024-11-27 06:17:15.164394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:07:45.638 [2024-11-27 06:17:15.164415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:45.897 #31 NEW cov: 11833 ft: 14423 corp: 21/508b lim: 50 exec/s: 31 rss: 69Mb L: 49/49 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:45.897 [2024-11-27 06:17:15.203724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 
lba:18446744073709551615 len:65536 00:07:45.897 [2024-11-27 06:17:15.203750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.897 #32 NEW cov: 11833 ft: 14455 corp: 22/525b lim: 50 exec/s: 32 rss: 69Mb L: 17/49 MS: 1 ChangeByte- 00:07:45.897 [2024-11-27 06:17:15.243776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:72057598332895231 len:65536 00:07:45.897 [2024-11-27 06:17:15.243804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.897 #33 NEW cov: 11833 ft: 14464 corp: 23/541b lim: 50 exec/s: 33 rss: 69Mb L: 16/49 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:45.897 [2024-11-27 06:17:15.284459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:72057598332895231 len:65536 00:07:45.897 [2024-11-27 06:17:15.284491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.897 [2024-11-27 06:17:15.284585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65291 00:07:45.897 [2024-11-27 06:17:15.284605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.897 [2024-11-27 06:17:15.284723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:45.897 [2024-11-27 06:17:15.284748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.897 [2024-11-27 06:17:15.284864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:07:45.897 [2024-11-27 06:17:15.284885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:45.897 #34 NEW cov: 11833 ft: 14470 corp: 24/590b lim: 50 exec/s: 34 rss: 69Mb L: 49/49 MS: 1 ShuffleBytes- 00:07:45.897 [2024-11-27 06:17:15.324169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4279173120 len:256 00:07:45.897 [2024-11-27 06:17:15.324201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.897 [2024-11-27 06:17:15.324321] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:45.898 [2024-11-27 06:17:15.324342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.898 #35 NEW cov: 11833 ft: 14505 corp: 25/615b lim: 50 exec/s: 35 rss: 69Mb L: 25/49 MS: 1 CMP- DE: "\017\000\000\000\000\000\000\000"- 00:07:45.898 [2024-11-27 06:17:15.374349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4294967295 len:1 00:07:45.898 [2024-11-27 06:17:15.374386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.898 [2024-11-27 06:17:15.374508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9007199254740992 len:1 00:07:45.898 [2024-11-27 06:17:15.374534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.898 #36 NEW cov: 11833 ft: 14513 corp: 26/639b lim: 50 exec/s: 36 rss: 70Mb L: 24/49 MS: 1 ShuffleBytes- 00:07:45.898 [2024-11-27 06:17:15.424380] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:45.898 [2024-11-27 06:17:15.424407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.158 #37 NEW cov: 11833 ft: 14524 corp: 27/655b lim: 50 exec/s: 37 rss: 70Mb L: 16/49 MS: 1 ShuffleBytes- 00:07:46.158 [2024-11-27 06:17:15.464716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:72057598332895231 len:4097 00:07:46.158 [2024-11-27 06:17:15.464744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.158 [2024-11-27 06:17:15.464862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4504699138998271 len:65536 00:07:46.158 [2024-11-27 06:17:15.464886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.158 #38 NEW cov: 11833 ft: 14537 corp: 28/680b lim: 50 exec/s: 38 rss: 70Mb L: 25/49 MS: 1 CopyPart- 00:07:46.158 [2024-11-27 06:17:15.505049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:46.158 [2024-11-27 06:17:15.505083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.158 [2024-11-27 06:17:15.505187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18377501233733697535 len:1 00:07:46.158 [2024-11-27 06:17:15.505212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.158 [2024-11-27 06:17:15.505330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:46.158 [2024-11-27 06:17:15.505351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.158 [2024-11-27 06:17:15.505469] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18331057858281472 len:1 00:07:46.158 [2024-11-27 06:17:15.505490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:46.158 #39 NEW cov: 11833 ft: 14577 corp: 29/725b lim: 50 exec/s: 39 rss: 70Mb L: 45/49 MS: 1 ChangeByte- 00:07:46.158 [2024-11-27 06:17:15.544820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4294967040 len:1 00:07:46.158 [2024-11-27 06:17:15.544851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.158 [2024-11-27 06:17:15.544970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18417752146830163967 len:65323 00:07:46.158 [2024-11-27 06:17:15.544992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.158 #40 NEW cov: 11833 ft: 14608 corp: 30/746b lim: 50 exec/s: 40 rss: 70Mb L: 21/49 MS: 1 InsertByte- 00:07:46.158 [2024-11-27 06:17:15.594902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742974381817855 len:1 00:07:46.158 [2024-11-27 06:17:15.594929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.158 #43 NEW cov: 11833 ft: 14614 corp: 31/761b lim: 50 exec/s: 43 rss: 70Mb L: 15/49 MS: 3 CopyPart-ChangeBinInt-CrossOver- 00:07:46.158 [2024-11-27 06:17:15.634928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:53198770767069184 len:1 00:07:46.158 [2024-11-27 06:17:15.634963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.158 #46 NEW cov: 11833 ft: 14646 corp: 32/771b lim: 50 exec/s: 46 rss: 70Mb L: 10/49 MS: 3 CopyPart-PersAutoDict-InsertByte- DE: "\017\000\000\000\000\000\000\000"- 00:07:46.158 [2024-11-27 06:17:15.675584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:72057598332895231 len:65536 00:07:46.158 [2024-11-27 06:17:15.675622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.158 [2024-11-27 06:17:15.675708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65291 00:07:46.158 [2024-11-27 06:17:15.675730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.158 [2024-11-27 06:17:15.675846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:46.158 [2024-11-27 06:17:15.675867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.158 [2024-11-27 06:17:15.675981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:11264 len:1 00:07:46.158 [2024-11-27 06:17:15.676005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:46.418 #47 NEW cov: 11833 ft: 14652 corp: 33/820b lim: 50 exec/s: 47 rss: 70Mb L: 49/49 MS: 1 ChangeByte- 00:07:46.418 [2024-11-27 06:17:15.725604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:72057598332895231 len:65536 00:07:46.418 [2024-11-27 06:17:15.725639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.418 [2024-11-27 06:17:15.725749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65291 00:07:46.418 [2024-11-27 06:17:15.725771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.418 [2024-11-27 
06:17:15.725894] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:167772160 len:1 00:07:46.418 [2024-11-27 06:17:15.725918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.418 #48 NEW cov: 11833 ft: 14658 corp: 34/857b lim: 50 exec/s: 48 rss: 70Mb L: 37/49 MS: 1 CrossOver- 00:07:46.418 [2024-11-27 06:17:15.775897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:46.418 [2024-11-27 06:17:15.775927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.418 [2024-11-27 06:17:15.776012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18377501233733697535 len:1 00:07:46.418 [2024-11-27 06:17:15.776032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.418 [2024-11-27 06:17:15.776143] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:46.418 [2024-11-27 06:17:15.776164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.418 [2024-11-27 06:17:15.776279] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744069431361535 len:65536 00:07:46.418 [2024-11-27 06:17:15.776309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:46.418 #49 NEW cov: 11833 ft: 14669 corp: 35/902b lim: 50 exec/s: 49 rss: 70Mb L: 45/49 MS: 1 CopyPart- 00:07:46.418 [2024-11-27 06:17:15.815463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:257 00:07:46.418 [2024-11-27 06:17:15.815495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.418 #50 NEW cov: 11833 ft: 14674 corp: 36/917b lim: 50 exec/s: 50 rss: 70Mb L: 15/49 MS: 1 CrossOver- 00:07:46.418 [2024-11-27 06:17:15.845858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:46.418 [2024-11-27 06:17:15.845890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.418 [2024-11-27 06:17:15.845976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:71879473154227984 len:65536 00:07:46.418 [2024-11-27 06:17:15.845997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.418 [2024-11-27 06:17:15.846108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1153202979583557631 len:65536 00:07:46.418 [2024-11-27 06:17:15.846130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.418 #51 NEW cov: 11833 ft: 14677 corp: 37/949b lim: 50 exec/s: 51 rss: 70Mb L: 32/49 MS: 1 InsertByte- 00:07:46.418 [2024-11-27 
06:17:15.895935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:8319119876378817395 len:29655 00:07:46.418 [2024-11-27 06:17:15.895962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.418 [2024-11-27 06:17:15.896076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8319119876378817395 len:29556 00:07:46.418 [2024-11-27 06:17:15.896098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.418 #52 NEW cov: 11833 ft: 14687 corp: 38/972b lim: 50 exec/s: 52 rss: 70Mb L: 23/49 MS: 1 EraseBytes- 00:07:46.418 [2024-11-27 06:17:15.946367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:8319119876378819443 len:29556 00:07:46.418 [2024-11-27 06:17:15.946403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.418 [2024-11-27 06:17:15.946525] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8319119876378817395 len:29556 00:07:46.418 [2024-11-27 06:17:15.946546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.418 [2024-11-27 06:17:15.946667] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8319119876378817395 len:29556 00:07:46.418 [2024-11-27 06:17:15.946685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.418 [2024-11-27 06:17:15.946793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:8319119876378817395 len:29556 00:07:46.418 [2024-11-27 06:17:15.946812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:46.678 #53 NEW cov: 11833 ft: 14705 corp: 39/1018b lim: 50 exec/s: 53 rss: 70Mb L: 46/49 MS: 1 ChangeBit- 00:07:46.678 [2024-11-27 06:17:15.986106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:46.678 [2024-11-27 06:17:15.986135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.678 [2024-11-27 06:17:15.986248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18374967954648334335 len:10877 00:07:46.678 [2024-11-27 06:17:15.986275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.678 #54 NEW cov: 11833 ft: 14713 corp: 40/1038b lim: 50 exec/s: 54 rss: 70Mb L: 20/49 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\000"- 00:07:46.678 [2024-11-27 06:17:16.026209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:72057598332895231 len:4095 00:07:46.678 [2024-11-27 06:17:16.026240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.678 [2024-11-27 06:17:16.026366] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4504699138998271 len:65536 00:07:46.678 [2024-11-27 06:17:16.026388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.678 #55 NEW cov: 11833 ft: 14719 corp: 41/1063b lim: 50 exec/s: 55 rss: 70Mb L: 25/49 MS: 1 ChangeBinInt- 00:07:46.678 [2024-11-27 06:17:16.066800] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4294967040 len:1 00:07:46.678 [2024-11-27 06:17:16.066837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.678 [2024-11-27 06:17:16.066949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4294967040 len:1 00:07:46.678 [2024-11-27 06:17:16.066968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.678 [2024-11-27 06:17:16.067089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18417752146830163967 len:65323 00:07:46.678 [2024-11-27 06:17:16.067112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.678 [2024-11-27 06:17:16.067227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446630821797363711 len:65536 00:07:46.678 [2024-11-27 06:17:16.067250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:46.678 #56 NEW cov: 11833 ft: 14748 corp: 42/1105b lim: 50 exec/s: 28 rss: 70Mb L: 42/49 MS: 1 CopyPart- 00:07:46.678 #56 DONE cov: 11833 ft: 14748 corp: 42/1105b lim: 50 exec/s: 28 rss: 70Mb 00:07:46.678 ###### Recommended dictionary. ###### 00:07:46.678 "\000\000\000\000\000\000\000\000" # Uses: 0 00:07:46.678 "\001\000\000\000" # Uses: 3 00:07:46.678 "\017\000\000\000\000\000\000\000" # Uses: 1 00:07:46.678 "\377\377\377\377\377\377\377\000" # Uses: 0 00:07:46.678 ###### End of recommended dictionary. 
###### 00:07:46.678 Done 56 runs in 2 second(s) 00:07:46.678 06:17:16 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:07:46.938 06:17:16 -- ../common.sh@72 -- # (( i++ )) 00:07:46.938 06:17:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:46.938 06:17:16 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:07:46.938 06:17:16 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:07:46.938 06:17:16 -- nvmf/run.sh@24 -- # local timen=1 00:07:46.938 06:17:16 -- nvmf/run.sh@25 -- # local core=0x1 00:07:46.938 06:17:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:46.938 06:17:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:07:46.938 06:17:16 -- nvmf/run.sh@29 -- # printf %02d 20 00:07:46.938 06:17:16 -- nvmf/run.sh@29 -- # port=4420 00:07:46.938 06:17:16 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:46.938 06:17:16 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:07:46.938 06:17:16 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:46.938 06:17:16 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:07:46.938 [2024-11-27 06:17:16.244896] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:46.938 [2024-11-27 06:17:16.244965] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid37950 ] 00:07:46.938 EAL: No free 2048 kB hugepages reported on node 1 00:07:46.938 [2024-11-27 06:17:16.421093] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.198 [2024-11-27 06:17:16.484529] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:47.198 [2024-11-27 06:17:16.484678] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.198 [2024-11-27 06:17:16.542439] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:47.198 [2024-11-27 06:17:16.558800] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:07:47.198 INFO: Running with entropic power schedule (0xFF, 100). 00:07:47.198 INFO: Seed: 785560377 00:07:47.198 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:47.198 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:47.198 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:47.198 INFO: A corpus is not provided, starting from an empty corpus 00:07:47.198 #2 INITED exec/s: 0 rss: 61Mb 00:07:47.198 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
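Each "Recommended dictionary" block above is printed at the end of a run and lists byte sequences the fuzzer found productive, with per-entry use counts; these can be carried into later runs through libFuzzer's standard -dict= option. A minimal sketch, assuming a hypothetical file name and converting the run-19 entries from the octal escapes printed above into the \xNN hex form that libFuzzer's documented dictionary syntax uses:

# Save the recommended entries to a dictionary file (name is illustrative):
cat > llvm_nvmf_19.dict <<'EOF'
"\x00\x00\x00\x00\x00\x00\x00\x00"
"\x01\x00\x00\x00"
"\x0f\x00\x00\x00\x00\x00\x00\x00"
"\xff\xff\xff\xff\xff\xff\xff\x00"
EOF
# ...then hand it to a later run with the standard libFuzzer flag
# (hypothetical invocation; target arguments elided):
#   llvm_nvme_fuzz <target args> -dict=llvm_nvmf_19.dict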
00:07:47.198 This may also happen if the target rejected all inputs we tried so far 00:07:47.198 [2024-11-27 06:17:16.613796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:47.198 [2024-11-27 06:17:16.613826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.457 NEW_FUNC[1/672]: 0x45c9d8 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:07:47.457 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:47.457 #17 NEW cov: 11664 ft: 11665 corp: 2/35b lim: 90 exec/s: 0 rss: 68Mb L: 34/34 MS: 5 ShuffleBytes-ChangeBit-InsertByte-CopyPart-InsertRepeatedBytes- 00:07:47.457 [2024-11-27 06:17:16.914984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:47.457 [2024-11-27 06:17:16.915022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.457 [2024-11-27 06:17:16.915086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:47.457 [2024-11-27 06:17:16.915107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.458 [2024-11-27 06:17:16.915168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:47.458 [2024-11-27 06:17:16.915187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.458 #23 NEW cov: 11777 ft: 12927 corp: 3/98b lim: 90 exec/s: 0 rss: 68Mb L: 63/63 MS: 1 CopyPart- 00:07:47.458 [2024-11-27 06:17:16.965006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:47.458 [2024-11-27 06:17:16.965034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.458 [2024-11-27 06:17:16.965071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:47.458 [2024-11-27 06:17:16.965086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.458 [2024-11-27 06:17:16.965139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:47.458 [2024-11-27 06:17:16.965154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.458 #24 NEW cov: 11783 ft: 13152 corp: 4/159b lim: 90 exec/s: 0 rss: 68Mb L: 61/63 MS: 1 CopyPart- 00:07:47.717 [2024-11-27 06:17:17.004955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:47.717 [2024-11-27 06:17:17.004984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.717 [2024-11-27 06:17:17.005045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:47.717 [2024-11-27 06:17:17.005061] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.717 #30 NEW cov: 11868 ft: 13682 corp: 5/207b lim: 90 exec/s: 0 rss: 68Mb L: 48/63 MS: 1 InsertRepeatedBytes- 00:07:47.717 [2024-11-27 06:17:17.045367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:47.717 [2024-11-27 06:17:17.045395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.717 [2024-11-27 06:17:17.045432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:47.717 [2024-11-27 06:17:17.045447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.717 [2024-11-27 06:17:17.045499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:47.717 [2024-11-27 06:17:17.045514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.717 [2024-11-27 06:17:17.045567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:47.717 [2024-11-27 06:17:17.045582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:47.717 #31 NEW cov: 11868 ft: 14085 corp: 6/284b lim: 90 exec/s: 0 rss: 68Mb L: 77/77 MS: 1 CopyPart- 00:07:47.717 [2024-11-27 06:17:17.085316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:47.717 [2024-11-27 06:17:17.085344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.717 [2024-11-27 06:17:17.085409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:47.717 [2024-11-27 06:17:17.085423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.717 [2024-11-27 06:17:17.085477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:47.718 [2024-11-27 06:17:17.085493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.718 #32 NEW cov: 11868 ft: 14243 corp: 7/345b lim: 90 exec/s: 0 rss: 68Mb L: 61/77 MS: 1 CMP- DE: "\377\377\377\015"- 00:07:47.718 [2024-11-27 06:17:17.125103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:47.718 [2024-11-27 06:17:17.125130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.718 #33 NEW cov: 11868 ft: 14478 corp: 8/380b lim: 90 exec/s: 0 rss: 68Mb L: 35/77 MS: 1 InsertByte- 00:07:47.718 [2024-11-27 06:17:17.165548] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:47.718 [2024-11-27 06:17:17.165574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.718 [2024-11-27 06:17:17.165612] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:47.718 [2024-11-27 06:17:17.165627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.718 [2024-11-27 06:17:17.165683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:47.718 [2024-11-27 06:17:17.165697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.718 #34 NEW cov: 11868 ft: 14526 corp: 9/443b lim: 90 exec/s: 0 rss: 68Mb L: 63/77 MS: 1 ChangeBinInt- 00:07:47.718 [2024-11-27 06:17:17.215394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:47.718 [2024-11-27 06:17:17.215420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.718 #35 NEW cov: 11868 ft: 14597 corp: 10/474b lim: 90 exec/s: 0 rss: 69Mb L: 31/77 MS: 1 InsertRepeatedBytes- 00:07:48.019 [2024-11-27 06:17:17.255803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.019 [2024-11-27 06:17:17.255831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.019 [2024-11-27 06:17:17.255867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.019 [2024-11-27 06:17:17.255882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.019 [2024-11-27 06:17:17.255936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.019 [2024-11-27 06:17:17.255951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.019 #36 NEW cov: 11868 ft: 14609 corp: 11/535b lim: 90 exec/s: 0 rss: 69Mb L: 61/77 MS: 1 CopyPart- 00:07:48.019 [2024-11-27 06:17:17.295901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.019 [2024-11-27 06:17:17.295928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.019 [2024-11-27 06:17:17.295995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.019 [2024-11-27 06:17:17.296011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.019 [2024-11-27 06:17:17.296064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.019 [2024-11-27 06:17:17.296078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.019 #37 NEW cov: 11868 ft: 14693 corp: 12/596b lim: 90 exec/s: 0 rss: 69Mb L: 61/77 MS: 1 ShuffleBytes- 00:07:48.019 [2024-11-27 06:17:17.336024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.019 [2024-11-27 06:17:17.336052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.019 [2024-11-27 06:17:17.336091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.019 [2024-11-27 06:17:17.336106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.019 [2024-11-27 06:17:17.336159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.019 [2024-11-27 06:17:17.336174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.019 #38 NEW cov: 11868 ft: 14737 corp: 13/661b lim: 90 exec/s: 0 rss: 69Mb L: 65/77 MS: 1 InsertRepeatedBytes- 00:07:48.019 [2024-11-27 06:17:17.375838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.019 [2024-11-27 06:17:17.375866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.019 #39 NEW cov: 11868 ft: 14792 corp: 14/692b lim: 90 exec/s: 0 rss: 69Mb L: 31/77 MS: 1 CopyPart- 00:07:48.019 [2024-11-27 06:17:17.415971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.019 [2024-11-27 06:17:17.415997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.019 #43 NEW cov: 11868 ft: 14850 corp: 15/721b lim: 90 exec/s: 0 rss: 69Mb L: 29/77 MS: 4 ChangeBit-InsertRepeatedBytes-ChangeBinInt-CrossOver- 00:07:48.019 [2024-11-27 06:17:17.446509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.019 [2024-11-27 06:17:17.446535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.019 [2024-11-27 06:17:17.446592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.019 [2024-11-27 06:17:17.446614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.019 [2024-11-27 06:17:17.446668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.019 [2024-11-27 06:17:17.446684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.019 [2024-11-27 06:17:17.446737] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:48.019 [2024-11-27 06:17:17.446762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:48.019 #44 NEW cov: 11868 ft: 14875 corp: 16/802b lim: 90 exec/s: 0 rss: 69Mb L: 81/81 MS: 1 InsertRepeatedBytes- 00:07:48.019 [2024-11-27 06:17:17.496197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.019 [2024-11-27 06:17:17.496223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.337 NEW_FUNC[1/1]: 0x194e708 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:48.337 #45 NEW cov: 11891 ft: 14910 corp: 17/822b lim: 90 exec/s: 0 rss: 69Mb L: 20/81 MS: 1 EraseBytes- 00:07:48.337 [2024-11-27 06:17:17.536624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.337 [2024-11-27 06:17:17.536651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.337 [2024-11-27 06:17:17.536692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.337 [2024-11-27 06:17:17.536708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.337 [2024-11-27 06:17:17.536763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.337 [2024-11-27 06:17:17.536778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.337 #46 NEW cov: 11891 ft: 14921 corp: 18/884b lim: 90 exec/s: 0 rss: 69Mb L: 62/81 MS: 1 InsertByte- 00:07:48.337 [2024-11-27 06:17:17.576764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.337 [2024-11-27 06:17:17.576792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.337 [2024-11-27 06:17:17.576828] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.337 [2024-11-27 06:17:17.576842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.337 [2024-11-27 06:17:17.576896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.337 [2024-11-27 06:17:17.576911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.337 #47 NEW cov: 11891 ft: 14928 corp: 19/946b lim: 90 exec/s: 47 rss: 69Mb L: 62/81 MS: 1 PersAutoDict- DE: "\377\377\377\015"- 00:07:48.337 [2024-11-27 06:17:17.626903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.337 [2024-11-27 06:17:17.626930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.337 [2024-11-27 06:17:17.626967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.337 [2024-11-27 06:17:17.626982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.337 [2024-11-27 06:17:17.627035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.337 [2024-11-27 06:17:17.627049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.337 #48 NEW cov: 11891 ft: 14961 corp: 20/1007b lim: 90 exec/s: 48 rss: 69Mb L: 61/81 MS: 1 CopyPart- 00:07:48.337 [2024-11-27 06:17:17.667193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE 
(11) sqid:1 cid:0 nsid:0 00:07:48.337 [2024-11-27 06:17:17.667220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.337 [2024-11-27 06:17:17.667280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.337 [2024-11-27 06:17:17.667296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.337 [2024-11-27 06:17:17.667348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.337 [2024-11-27 06:17:17.667363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.337 [2024-11-27 06:17:17.667419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:48.337 [2024-11-27 06:17:17.667435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:48.337 #49 NEW cov: 11891 ft: 14966 corp: 21/1093b lim: 90 exec/s: 49 rss: 69Mb L: 86/86 MS: 1 InsertRepeatedBytes- 00:07:48.337 [2024-11-27 06:17:17.717289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.337 [2024-11-27 06:17:17.717316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.337 [2024-11-27 06:17:17.717364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.337 [2024-11-27 06:17:17.717379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.337 [2024-11-27 06:17:17.717431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.337 [2024-11-27 06:17:17.717447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.337 [2024-11-27 06:17:17.717499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:48.337 [2024-11-27 06:17:17.717514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:48.337 #50 NEW cov: 11891 ft: 15009 corp: 22/1174b lim: 90 exec/s: 50 rss: 69Mb L: 81/86 MS: 1 CMP- DE: "\003\000"- 00:07:48.337 [2024-11-27 06:17:17.757026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.337 [2024-11-27 06:17:17.757052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.337 #51 NEW cov: 11891 ft: 15040 corp: 23/1208b lim: 90 exec/s: 51 rss: 70Mb L: 34/86 MS: 1 ChangeByte- 00:07:48.337 [2024-11-27 06:17:17.797110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.337 [2024-11-27 06:17:17.797138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.337 #52 NEW cov: 11891 ft: 15136 corp: 24/1228b lim: 90 exec/s: 52 rss: 70Mb L: 20/86 MS: 1 CMP- 
DE: "q\004\020,\343\177\000\000"- 00:07:48.337 [2024-11-27 06:17:17.837581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.338 [2024-11-27 06:17:17.837664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.338 [2024-11-27 06:17:17.837736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.338 [2024-11-27 06:17:17.837752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.338 [2024-11-27 06:17:17.837806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.338 [2024-11-27 06:17:17.837820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.626 #53 NEW cov: 11891 ft: 15215 corp: 25/1283b lim: 90 exec/s: 53 rss: 70Mb L: 55/86 MS: 1 CrossOver- 00:07:48.626 [2024-11-27 06:17:17.877659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.626 [2024-11-27 06:17:17.877685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.626 [2024-11-27 06:17:17.877722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.626 [2024-11-27 06:17:17.877738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.626 [2024-11-27 06:17:17.877794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.626 [2024-11-27 06:17:17.877809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.626 #54 NEW cov: 11891 ft: 15230 corp: 26/1342b lim: 90 exec/s: 54 rss: 70Mb L: 59/86 MS: 1 PersAutoDict- DE: "\377\377\377\015"- 00:07:48.626 [2024-11-27 06:17:17.927501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.627 [2024-11-27 06:17:17.927527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.627 #55 NEW cov: 11891 ft: 15235 corp: 27/1362b lim: 90 exec/s: 55 rss: 70Mb L: 20/86 MS: 1 ChangeBit- 00:07:48.627 [2024-11-27 06:17:17.968051] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.627 [2024-11-27 06:17:17.968078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.627 [2024-11-27 06:17:17.968125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.627 [2024-11-27 06:17:17.968141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.627 [2024-11-27 06:17:17.968195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.627 [2024-11-27 06:17:17.968211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.627 [2024-11-27 06:17:17.968263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:48.627 [2024-11-27 06:17:17.968278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:48.627 #56 NEW cov: 11891 ft: 15278 corp: 28/1439b lim: 90 exec/s: 56 rss: 70Mb L: 77/86 MS: 1 ShuffleBytes- 00:07:48.627 [2024-11-27 06:17:18.018204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.627 [2024-11-27 06:17:18.018231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.627 [2024-11-27 06:17:18.018270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.627 [2024-11-27 06:17:18.018282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.627 [2024-11-27 06:17:18.018334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.627 [2024-11-27 06:17:18.018349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.627 [2024-11-27 06:17:18.018400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:48.627 [2024-11-27 06:17:18.018415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:48.627 #57 NEW cov: 11891 ft: 15280 corp: 29/1525b lim: 90 exec/s: 57 rss: 70Mb L: 86/86 MS: 1 ChangeByte- 00:07:48.627 [2024-11-27 06:17:18.058173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.627 [2024-11-27 06:17:18.058200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.627 [2024-11-27 06:17:18.058238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.627 [2024-11-27 06:17:18.058253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.627 [2024-11-27 06:17:18.058305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.627 [2024-11-27 06:17:18.058323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.627 #58 NEW cov: 11891 ft: 15299 corp: 30/1586b lim: 90 exec/s: 58 rss: 70Mb L: 61/86 MS: 1 PersAutoDict- DE: "\377\377\377\015"- 00:07:48.627 [2024-11-27 06:17:18.097950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.627 [2024-11-27 06:17:18.097976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.627 #59 NEW cov: 11891 ft: 15307 corp: 31/1606b lim: 90 exec/s: 59 rss: 70Mb L: 20/86 MS: 1 PersAutoDict- DE: "\377\377\377\015"- 00:07:48.627 [2024-11-27 06:17:18.138360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.627 [2024-11-27 06:17:18.138388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.627 [2024-11-27 06:17:18.138438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.627 [2024-11-27 06:17:18.138454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.627 [2024-11-27 06:17:18.138508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.627 [2024-11-27 06:17:18.138523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.886 #60 NEW cov: 11891 ft: 15324 corp: 32/1671b lim: 90 exec/s: 60 rss: 70Mb L: 65/86 MS: 1 ChangeByte- 00:07:48.886 [2024-11-27 06:17:18.178341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.886 [2024-11-27 06:17:18.178367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.886 [2024-11-27 06:17:18.178431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.886 [2024-11-27 06:17:18.178447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.886 #61 NEW cov: 11891 ft: 15333 corp: 33/1707b lim: 90 exec/s: 61 rss: 70Mb L: 36/86 MS: 1 InsertByte- 00:07:48.886 [2024-11-27 06:17:18.218723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.886 [2024-11-27 06:17:18.218750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.886 [2024-11-27 06:17:18.218795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.886 [2024-11-27 06:17:18.218810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.886 [2024-11-27 06:17:18.218863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.886 [2024-11-27 06:17:18.218877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.886 [2024-11-27 06:17:18.218931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:48.886 [2024-11-27 06:17:18.218946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:48.887 #62 NEW cov: 11891 ft: 15345 corp: 34/1790b lim: 90 exec/s: 62 rss: 70Mb L: 83/86 MS: 1 CopyPart- 00:07:48.887 [2024-11-27 06:17:18.258435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.887 [2024-11-27 06:17:18.258462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.887 #63 NEW cov: 11891 ft: 15356 corp: 35/1810b lim: 90 exec/s: 63 rss: 70Mb L: 20/86 MS: 1 CMP- 
DE: "\337\347\017,\343\177\000\000"- 00:07:48.887 [2024-11-27 06:17:18.298957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.887 [2024-11-27 06:17:18.298984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.887 [2024-11-27 06:17:18.299031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.887 [2024-11-27 06:17:18.299046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.887 [2024-11-27 06:17:18.299099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.887 [2024-11-27 06:17:18.299115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.887 [2024-11-27 06:17:18.299169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:48.887 [2024-11-27 06:17:18.299184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:48.887 #64 NEW cov: 11891 ft: 15373 corp: 36/1898b lim: 90 exec/s: 64 rss: 70Mb L: 88/88 MS: 1 InsertRepeatedBytes- 00:07:48.887 [2024-11-27 06:17:18.338941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.887 [2024-11-27 06:17:18.338967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.887 [2024-11-27 06:17:18.339003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.887 [2024-11-27 06:17:18.339019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.887 [2024-11-27 06:17:18.339072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.887 [2024-11-27 06:17:18.339085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.887 #65 NEW cov: 11891 ft: 15419 corp: 37/1960b lim: 90 exec/s: 65 rss: 70Mb L: 62/88 MS: 1 ChangeByte- 00:07:48.887 [2024-11-27 06:17:18.379185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.887 [2024-11-27 06:17:18.379212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.887 [2024-11-27 06:17:18.379251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.887 [2024-11-27 06:17:18.379266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.887 [2024-11-27 06:17:18.379320] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.887 [2024-11-27 06:17:18.379336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.887 [2024-11-27 06:17:18.379390] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:48.887 [2024-11-27 06:17:18.379405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:48.887 #66 NEW cov: 11891 ft: 15435 corp: 38/2043b lim: 90 exec/s: 66 rss: 70Mb L: 83/88 MS: 1 ChangeBit- 00:07:48.887 [2024-11-27 06:17:18.419160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:48.887 [2024-11-27 06:17:18.419189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.887 [2024-11-27 06:17:18.419225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:48.887 [2024-11-27 06:17:18.419246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.887 [2024-11-27 06:17:18.419302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:48.887 [2024-11-27 06:17:18.419317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.146 #67 NEW cov: 11891 ft: 15448 corp: 39/2106b lim: 90 exec/s: 67 rss: 70Mb L: 63/88 MS: 1 ShuffleBytes- 00:07:49.146 [2024-11-27 06:17:18.458989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:49.146 [2024-11-27 06:17:18.459017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.146 #68 NEW cov: 11891 ft: 15478 corp: 40/2139b lim: 90 exec/s: 68 rss: 70Mb L: 33/88 MS: 1 EraseBytes- 00:07:49.146 [2024-11-27 06:17:18.499405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:49.146 [2024-11-27 06:17:18.499433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.146 [2024-11-27 06:17:18.499497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:49.146 [2024-11-27 06:17:18.499512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.146 [2024-11-27 06:17:18.499567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:49.146 [2024-11-27 06:17:18.499583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.146 #69 NEW cov: 11891 ft: 15490 corp: 41/2209b lim: 90 exec/s: 69 rss: 70Mb L: 70/88 MS: 1 PersAutoDict- DE: "\337\347\017,\343\177\000\000"- 00:07:49.146 [2024-11-27 06:17:18.539527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:49.146 [2024-11-27 06:17:18.539554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.146 [2024-11-27 06:17:18.539612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:49.146 [2024-11-27 06:17:18.539628] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.146 [2024-11-27 06:17:18.539684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:49.146 [2024-11-27 06:17:18.539699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.146 #70 NEW cov: 11891 ft: 15496 corp: 42/2269b lim: 90 exec/s: 70 rss: 70Mb L: 60/88 MS: 1 InsertByte- 00:07:49.146 [2024-11-27 06:17:18.579674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:49.146 [2024-11-27 06:17:18.579701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.147 [2024-11-27 06:17:18.579774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:49.147 [2024-11-27 06:17:18.579790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.147 [2024-11-27 06:17:18.579845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:49.147 [2024-11-27 06:17:18.579861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.147 #71 NEW cov: 11891 ft: 15531 corp: 43/2328b lim: 90 exec/s: 35 rss: 70Mb L: 59/88 MS: 1 ShuffleBytes- 00:07:49.147 #71 DONE cov: 11891 ft: 15531 corp: 43/2328b lim: 90 exec/s: 35 rss: 70Mb 00:07:49.147 ###### Recommended dictionary. ###### 00:07:49.147 "\377\377\377\015" # Uses: 4 00:07:49.147 "\003\000" # Uses: 0 00:07:49.147 "q\004\020,\343\177\000\000" # Uses: 0 00:07:49.147 "\337\347\017,\343\177\000\000" # Uses: 1 00:07:49.147 ###### End of recommended dictionary. 
###### 00:07:49.147 Done 71 runs in 2 second(s) 00:07:49.407 06:17:18 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:07:49.407 06:17:18 -- ../common.sh@72 -- # (( i++ )) 00:07:49.407 06:17:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:49.407 06:17:18 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:07:49.407 06:17:18 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:07:49.407 06:17:18 -- nvmf/run.sh@24 -- # local timen=1 00:07:49.407 06:17:18 -- nvmf/run.sh@25 -- # local core=0x1 00:07:49.407 06:17:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:49.407 06:17:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:07:49.407 06:17:18 -- nvmf/run.sh@29 -- # printf %02d 21 00:07:49.407 06:17:18 -- nvmf/run.sh@29 -- # port=4421 00:07:49.407 06:17:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:49.407 06:17:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:07:49.407 06:17:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:49.407 06:17:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:07:49.407 [2024-11-27 06:17:18.756773] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:49.407 [2024-11-27 06:17:18.756863] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid38377 ] 00:07:49.407 EAL: No free 2048 kB hugepages reported on node 1 00:07:49.407 [2024-11-27 06:17:18.940262] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.667 [2024-11-27 06:17:19.006851] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:49.667 [2024-11-27 06:17:19.006997] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.667 [2024-11-27 06:17:19.064997] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:49.667 [2024-11-27 06:17:19.081347] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:07:49.667 INFO: Running with entropic power schedule (0xFF, 100). 00:07:49.667 INFO: Seed: 3307552003 00:07:49.667 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:49.667 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:49.667 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:49.667 INFO: A corpus is not provided, starting from an empty corpus 00:07:49.667 #2 INITED exec/s: 0 rss: 60Mb 00:07:49.667 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:49.667 This may also happen if the target rejected all inputs we tried so far 00:07:49.667 [2024-11-27 06:17:19.146634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:49.667 [2024-11-27 06:17:19.146677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.924 NEW_FUNC[1/672]: 0x45fc08 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:07:49.924 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:49.924 #5 NEW cov: 11632 ft: 11640 corp: 2/19b lim: 50 exec/s: 0 rss: 68Mb L: 18/18 MS: 3 ChangeBit-CrossOver-InsertRepeatedBytes- 00:07:50.183 [2024-11-27 06:17:19.468275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.183 [2024-11-27 06:17:19.468331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.183 [2024-11-27 06:17:19.468474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:50.183 [2024-11-27 06:17:19.468505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.183 #8 NEW cov: 11752 ft: 13094 corp: 3/48b lim: 50 exec/s: 0 rss: 68Mb L: 29/29 MS: 3 ChangeByte-CrossOver-InsertRepeatedBytes- 00:07:50.183 [2024-11-27 06:17:19.508058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.183 [2024-11-27 06:17:19.508083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.183 #20 NEW cov: 11758 ft: 13292 corp: 4/67b lim: 50 exec/s: 0 rss: 68Mb L: 19/29 MS: 2 ChangeBinInt-CrossOver- 00:07:50.183 [2024-11-27 06:17:19.548177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.183 [2024-11-27 06:17:19.548212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.183 #21 NEW cov: 11843 ft: 13511 corp: 5/85b lim: 50 exec/s: 0 rss: 68Mb L: 18/29 MS: 1 CrossOver- 00:07:50.183 [2024-11-27 06:17:19.588618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.183 [2024-11-27 06:17:19.588657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.183 [2024-11-27 06:17:19.588763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:50.183 [2024-11-27 06:17:19.588782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.183 #22 NEW cov: 11843 ft: 13660 corp: 6/110b lim: 50 exec/s: 0 rss: 68Mb L: 25/29 MS: 1 EraseBytes- 00:07:50.183 [2024-11-27 06:17:19.628712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.183 [2024-11-27 06:17:19.628737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.183 [2024-11-27 06:17:19.628862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:50.183 [2024-11-27 06:17:19.628882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.183 #23 NEW cov: 11843 ft: 13740 corp: 7/139b lim: 50 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 ChangeBit- 00:07:50.183 [2024-11-27 06:17:19.668573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.183 [2024-11-27 06:17:19.668603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.183 #24 NEW cov: 11843 ft: 13835 corp: 8/158b lim: 50 exec/s: 0 rss: 68Mb L: 19/29 MS: 1 CrossOver- 00:07:50.183 [2024-11-27 06:17:19.708925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.183 [2024-11-27 06:17:19.708951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.183 [2024-11-27 06:17:19.709082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:50.183 [2024-11-27 06:17:19.709104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.443 #25 NEW cov: 11843 ft: 13931 corp: 9/187b lim: 50 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 ChangeBit- 00:07:50.443 [2024-11-27 06:17:19.748777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.443 [2024-11-27 06:17:19.748802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.443 #26 NEW cov: 11843 ft: 13944 corp: 10/205b lim: 50 exec/s: 0 rss: 68Mb L: 18/29 MS: 1 ChangeByte- 00:07:50.443 [2024-11-27 06:17:19.788817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.443 [2024-11-27 06:17:19.788847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.443 #27 NEW cov: 11843 ft: 13965 corp: 11/224b lim: 50 exec/s: 0 rss: 68Mb L: 19/29 MS: 1 InsertByte- 00:07:50.443 [2024-11-27 06:17:19.829544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.443 [2024-11-27 06:17:19.829574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.443 [2024-11-27 06:17:19.829696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:50.443 [2024-11-27 06:17:19.829721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.443 [2024-11-27 06:17:19.829839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:50.443 [2024-11-27 06:17:19.829861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.443 #28 NEW cov: 11843 ft: 14322 corp: 12/254b lim: 50 
exec/s: 0 rss: 68Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:50.443 [2024-11-27 06:17:19.879493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.443 [2024-11-27 06:17:19.879520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.443 [2024-11-27 06:17:19.879648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:50.443 [2024-11-27 06:17:19.879672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.443 #29 NEW cov: 11843 ft: 14341 corp: 13/281b lim: 50 exec/s: 0 rss: 68Mb L: 27/30 MS: 1 EraseBytes- 00:07:50.443 [2024-11-27 06:17:19.919329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.443 [2024-11-27 06:17:19.919354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.443 #30 NEW cov: 11843 ft: 14383 corp: 14/299b lim: 50 exec/s: 0 rss: 68Mb L: 18/30 MS: 1 ChangeBinInt- 00:07:50.443 [2024-11-27 06:17:19.959710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.443 [2024-11-27 06:17:19.959742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.443 [2024-11-27 06:17:19.959858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:50.443 [2024-11-27 06:17:19.959877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.702 #31 NEW cov: 11843 ft: 14397 corp: 15/328b lim: 50 exec/s: 0 rss: 68Mb L: 29/30 MS: 1 ShuffleBytes- 00:07:50.702 [2024-11-27 06:17:20.010171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.702 [2024-11-27 06:17:20.010198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.702 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:50.702 #32 NEW cov: 11866 ft: 14426 corp: 16/346b lim: 50 exec/s: 0 rss: 69Mb L: 18/30 MS: 1 ShuffleBytes- 00:07:50.702 [2024-11-27 06:17:20.059832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.702 [2024-11-27 06:17:20.059866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.702 #33 NEW cov: 11866 ft: 14467 corp: 17/365b lim: 50 exec/s: 0 rss: 69Mb L: 19/30 MS: 1 ShuffleBytes- 00:07:50.702 [2024-11-27 06:17:20.100709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.702 [2024-11-27 06:17:20.100753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.702 [2024-11-27 06:17:20.100840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:50.702 [2024-11-27 06:17:20.100864] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.702 [2024-11-27 06:17:20.100976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:50.702 [2024-11-27 06:17:20.101002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.702 [2024-11-27 06:17:20.101121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:50.702 [2024-11-27 06:17:20.101146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.702 #34 NEW cov: 11866 ft: 14866 corp: 18/410b lim: 50 exec/s: 0 rss: 69Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:07:50.702 [2024-11-27 06:17:20.140549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.702 [2024-11-27 06:17:20.140579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.702 [2024-11-27 06:17:20.140708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:50.702 [2024-11-27 06:17:20.140729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.702 [2024-11-27 06:17:20.140843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:50.702 [2024-11-27 06:17:20.140864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.702 [2024-11-27 06:17:20.140981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:50.702 [2024-11-27 06:17:20.141004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.702 #35 NEW cov: 11866 ft: 14948 corp: 19/454b lim: 50 exec/s: 35 rss: 69Mb L: 44/45 MS: 1 InsertRepeatedBytes- 00:07:50.702 [2024-11-27 06:17:20.190658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.702 [2024-11-27 06:17:20.190691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.703 [2024-11-27 06:17:20.190801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:50.703 [2024-11-27 06:17:20.190821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.703 [2024-11-27 06:17:20.190937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:50.703 [2024-11-27 06:17:20.190960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.703 #36 NEW cov: 11866 ft: 14982 corp: 20/484b lim: 50 exec/s: 36 rss: 69Mb L: 30/45 MS: 1 InsertByte- 00:07:50.703 [2024-11-27 06:17:20.230241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.703 [2024-11-27 06:17:20.230268] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.962 #37 NEW cov: 11866 ft: 15075 corp: 21/501b lim: 50 exec/s: 37 rss: 69Mb L: 17/45 MS: 1 EraseBytes- 00:07:50.962 [2024-11-27 06:17:20.280665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.962 [2024-11-27 06:17:20.280698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.962 [2024-11-27 06:17:20.280819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:50.962 [2024-11-27 06:17:20.280843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.962 #38 NEW cov: 11866 ft: 15087 corp: 22/529b lim: 50 exec/s: 38 rss: 69Mb L: 28/45 MS: 1 InsertRepeatedBytes- 00:07:50.962 [2024-11-27 06:17:20.320560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.962 [2024-11-27 06:17:20.320586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.962 #39 NEW cov: 11866 ft: 15133 corp: 23/548b lim: 50 exec/s: 39 rss: 69Mb L: 19/45 MS: 1 ChangeBit- 00:07:50.962 [2024-11-27 06:17:20.360689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.962 [2024-11-27 06:17:20.360716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.962 #40 NEW cov: 11866 ft: 15173 corp: 24/567b lim: 50 exec/s: 40 rss: 69Mb L: 19/45 MS: 1 ChangeBit- 00:07:50.962 [2024-11-27 06:17:20.400805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.962 [2024-11-27 06:17:20.400832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.962 #41 NEW cov: 11866 ft: 15180 corp: 25/585b lim: 50 exec/s: 41 rss: 69Mb L: 18/45 MS: 1 CopyPart- 00:07:50.962 [2024-11-27 06:17:20.440501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:50.962 [2024-11-27 06:17:20.440528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.962 #42 NEW cov: 11866 ft: 15194 corp: 26/603b lim: 50 exec/s: 42 rss: 69Mb L: 18/45 MS: 1 CrossOver- 00:07:51.222 [2024-11-27 06:17:20.501116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.222 [2024-11-27 06:17:20.501143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.222 #43 NEW cov: 11866 ft: 15295 corp: 27/622b lim: 50 exec/s: 43 rss: 69Mb L: 19/45 MS: 1 CrossOver- 00:07:51.222 [2024-11-27 06:17:20.551957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.222 [2024-11-27 06:17:20.551989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.222 [2024-11-27 06:17:20.552069] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:51.222 [2024-11-27 06:17:20.552092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.222 [2024-11-27 06:17:20.552206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:51.222 [2024-11-27 06:17:20.552228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.222 [2024-11-27 06:17:20.552348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:51.222 [2024-11-27 06:17:20.552368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.222 #44 NEW cov: 11866 ft: 15349 corp: 28/670b lim: 50 exec/s: 44 rss: 69Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:07:51.222 [2024-11-27 06:17:20.601361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.222 [2024-11-27 06:17:20.601389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.222 #45 NEW cov: 11866 ft: 15356 corp: 29/689b lim: 50 exec/s: 45 rss: 69Mb L: 19/48 MS: 1 ChangeByte- 00:07:51.222 [2024-11-27 06:17:20.641514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.222 [2024-11-27 06:17:20.641539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.222 #46 NEW cov: 11866 ft: 15373 corp: 30/708b lim: 50 exec/s: 46 rss: 69Mb L: 19/48 MS: 1 ChangeByte- 00:07:51.222 [2024-11-27 06:17:20.681590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.222 [2024-11-27 06:17:20.681620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.222 #47 NEW cov: 11866 ft: 15444 corp: 31/727b lim: 50 exec/s: 47 rss: 69Mb L: 19/48 MS: 1 CMP- DE: "\003\000\000\000"- 00:07:51.222 [2024-11-27 06:17:20.721931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.222 [2024-11-27 06:17:20.721964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.222 [2024-11-27 06:17:20.722087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:51.222 [2024-11-27 06:17:20.722108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.222 #48 NEW cov: 11866 ft: 15445 corp: 32/752b lim: 50 exec/s: 48 rss: 69Mb L: 25/48 MS: 1 CrossOver- 00:07:51.481 [2024-11-27 06:17:20.762587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.481 [2024-11-27 06:17:20.762621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.481 [2024-11-27 06:17:20.762730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 
nsid:0 00:07:51.481 [2024-11-27 06:17:20.762753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.481 [2024-11-27 06:17:20.762867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:51.481 [2024-11-27 06:17:20.762892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.481 [2024-11-27 06:17:20.763014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:51.481 [2024-11-27 06:17:20.763035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.481 #49 NEW cov: 11866 ft: 15459 corp: 33/796b lim: 50 exec/s: 49 rss: 70Mb L: 44/48 MS: 1 CrossOver- 00:07:51.481 [2024-11-27 06:17:20.811980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.481 [2024-11-27 06:17:20.812005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.481 #50 NEW cov: 11866 ft: 15462 corp: 34/815b lim: 50 exec/s: 50 rss: 70Mb L: 19/48 MS: 1 ChangeBit- 00:07:51.481 [2024-11-27 06:17:20.852406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.481 [2024-11-27 06:17:20.852432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.481 [2024-11-27 06:17:20.852561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:51.481 [2024-11-27 06:17:20.852583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.481 #51 NEW cov: 11866 ft: 15468 corp: 35/843b lim: 50 exec/s: 51 rss: 70Mb L: 28/48 MS: 1 CopyPart- 00:07:51.481 [2024-11-27 06:17:20.892237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.481 [2024-11-27 06:17:20.892268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.481 #52 NEW cov: 11866 ft: 15478 corp: 36/862b lim: 50 exec/s: 52 rss: 70Mb L: 19/48 MS: 1 CrossOver- 00:07:51.481 [2024-11-27 06:17:20.932496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.481 [2024-11-27 06:17:20.932521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.481 #53 NEW cov: 11866 ft: 15484 corp: 37/880b lim: 50 exec/s: 53 rss: 70Mb L: 18/48 MS: 1 CopyPart- 00:07:51.481 [2024-11-27 06:17:20.972618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.481 [2024-11-27 06:17:20.972643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.481 #54 NEW cov: 11866 ft: 15547 corp: 38/897b lim: 50 exec/s: 54 rss: 70Mb L: 17/48 MS: 1 ShuffleBytes- 00:07:51.482 [2024-11-27 06:17:21.012832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 
cid:0 nsid:0 00:07:51.482 [2024-11-27 06:17:21.012857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.741 #55 NEW cov: 11866 ft: 15559 corp: 39/915b lim: 50 exec/s: 55 rss: 70Mb L: 18/48 MS: 1 CrossOver- 00:07:51.741 [2024-11-27 06:17:21.052942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.741 [2024-11-27 06:17:21.052970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.741 #56 NEW cov: 11866 ft: 15577 corp: 40/934b lim: 50 exec/s: 56 rss: 70Mb L: 19/48 MS: 1 ChangeByte- 00:07:51.741 [2024-11-27 06:17:21.092993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.741 [2024-11-27 06:17:21.093022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.741 #57 NEW cov: 11866 ft: 15582 corp: 41/953b lim: 50 exec/s: 57 rss: 70Mb L: 19/48 MS: 1 ChangeBinInt- 00:07:51.741 [2024-11-27 06:17:21.133847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:51.741 [2024-11-27 06:17:21.133878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.741 [2024-11-27 06:17:21.133991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:51.741 [2024-11-27 06:17:21.134009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.741 [2024-11-27 06:17:21.134126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:51.741 [2024-11-27 06:17:21.134149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.741 [2024-11-27 06:17:21.134264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:51.741 [2024-11-27 06:17:21.134286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.741 #58 NEW cov: 11866 ft: 15597 corp: 42/1001b lim: 50 exec/s: 29 rss: 70Mb L: 48/48 MS: 1 CopyPart- 00:07:51.741 #58 DONE cov: 11866 ft: 15597 corp: 42/1001b lim: 50 exec/s: 29 rss: 70Mb 00:07:51.741 ###### Recommended dictionary. ###### 00:07:51.741 "\003\000\000\000" # Uses: 0 00:07:51.741 ###### End of recommended dictionary. 
###### 00:07:51.741 Done 58 runs in 2 second(s) 00:07:52.000 06:17:21 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:07:52.000 06:17:21 -- ../common.sh@72 -- # (( i++ )) 00:07:52.000 06:17:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:52.000 06:17:21 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:07:52.000 06:17:21 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:07:52.000 06:17:21 -- nvmf/run.sh@24 -- # local timen=1 00:07:52.000 06:17:21 -- nvmf/run.sh@25 -- # local core=0x1 00:07:52.000 06:17:21 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:52.000 06:17:21 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:07:52.000 06:17:21 -- nvmf/run.sh@29 -- # printf %02d 22 00:07:52.000 06:17:21 -- nvmf/run.sh@29 -- # port=4422 00:07:52.000 06:17:21 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:52.000 06:17:21 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:07:52.000 06:17:21 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:52.000 06:17:21 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:07:52.000 [2024-11-27 06:17:21.300687] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:52.000 [2024-11-27 06:17:21.300739] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid38784 ] 00:07:52.000 EAL: No free 2048 kB hugepages reported on node 1 00:07:52.000 [2024-11-27 06:17:21.477540] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.259 [2024-11-27 06:17:21.541251] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:52.259 [2024-11-27 06:17:21.541394] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.259 [2024-11-27 06:17:21.599700] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.259 [2024-11-27 06:17:21.616042] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:07:52.259 INFO: Running with entropic power schedule (0xFF, 100). 00:07:52.259 INFO: Seed: 1546590696 00:07:52.259 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:52.259 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:52.259 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:52.259 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.259 #2 INITED exec/s: 0 rss: 61Mb 00:07:52.259 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:52.259 This may also happen if the target rejected all inputs we tried so far 00:07:52.259 [2024-11-27 06:17:21.682540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:52.259 [2024-11-27 06:17:21.682584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.259 [2024-11-27 06:17:21.682736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:52.259 [2024-11-27 06:17:21.682758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.259 [2024-11-27 06:17:21.682874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:52.259 [2024-11-27 06:17:21.682896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.259 [2024-11-27 06:17:21.683016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:52.259 [2024-11-27 06:17:21.683038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.519 NEW_FUNC[1/672]: 0x461ed8 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:07:52.519 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:52.519 #7 NEW cov: 11665 ft: 11666 corp: 2/76b lim: 85 exec/s: 0 rss: 68Mb L: 75/75 MS: 5 CopyPart-InsertByte-InsertRepeatedBytes-ChangeBit-InsertRepeatedBytes- 00:07:52.519 [2024-11-27 06:17:22.023390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:52.519 [2024-11-27 06:17:22.023432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.519 [2024-11-27 06:17:22.023558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:52.519 [2024-11-27 06:17:22.023581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.519 [2024-11-27 06:17:22.023696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:52.519 [2024-11-27 06:17:22.023722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.519 [2024-11-27 06:17:22.023840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:52.519 [2024-11-27 06:17:22.023862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.519 #8 NEW cov: 11778 ft: 12274 corp: 3/152b lim: 85 exec/s: 0 rss: 68Mb L: 76/76 MS: 1 InsertByte- 00:07:52.778 [2024-11-27 06:17:22.073330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:52.778 [2024-11-27 06:17:22.073362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.778 [2024-11-27 06:17:22.073487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:52.778 [2024-11-27 06:17:22.073510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.778 [2024-11-27 06:17:22.073632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:52.778 [2024-11-27 06:17:22.073654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.778 [2024-11-27 06:17:22.073771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:52.778 [2024-11-27 06:17:22.073796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.778 #9 NEW cov: 11784 ft: 12713 corp: 4/235b lim: 85 exec/s: 0 rss: 68Mb L: 83/83 MS: 1 InsertRepeatedBytes- 00:07:52.778 [2024-11-27 06:17:22.113478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:52.778 [2024-11-27 06:17:22.113510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.778 [2024-11-27 06:17:22.113617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:52.778 [2024-11-27 06:17:22.113653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.778 [2024-11-27 06:17:22.113764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:52.778 [2024-11-27 06:17:22.113785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.778 [2024-11-27 06:17:22.113899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:52.778 [2024-11-27 06:17:22.113917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.778 #10 NEW cov: 11869 ft: 12955 corp: 5/312b lim: 85 exec/s: 0 rss: 68Mb L: 77/83 MS: 1 InsertByte- 00:07:52.778 [2024-11-27 06:17:22.153554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:52.778 [2024-11-27 06:17:22.153584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.778 [2024-11-27 06:17:22.153709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:52.778 [2024-11-27 06:17:22.153731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.778 [2024-11-27 06:17:22.153841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:52.778 [2024-11-27 06:17:22.153861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.778 [2024-11-27 06:17:22.153981] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:52.779 [2024-11-27 06:17:22.154003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.779 #11 NEW cov: 11869 ft: 13054 corp: 6/395b lim: 85 exec/s: 0 rss: 69Mb L: 83/83 MS: 1 InsertRepeatedBytes- 00:07:52.779 [2024-11-27 06:17:22.193669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:52.779 [2024-11-27 06:17:22.193699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.779 [2024-11-27 06:17:22.193807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:52.779 [2024-11-27 06:17:22.193824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.779 [2024-11-27 06:17:22.193932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:52.779 [2024-11-27 06:17:22.193953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.779 [2024-11-27 06:17:22.194069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:52.779 [2024-11-27 06:17:22.194088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.779 #12 NEW cov: 11869 ft: 13157 corp: 7/472b lim: 85 exec/s: 0 rss: 69Mb L: 77/83 MS: 1 ChangeBit- 00:07:52.779 [2024-11-27 06:17:22.233843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:52.779 [2024-11-27 06:17:22.233875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.779 [2024-11-27 06:17:22.233946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:52.779 [2024-11-27 06:17:22.233967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.779 [2024-11-27 06:17:22.234082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:52.779 [2024-11-27 06:17:22.234100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.779 [2024-11-27 06:17:22.234216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:52.779 [2024-11-27 06:17:22.234235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.779 #13 NEW cov: 11869 ft: 13183 corp: 8/555b lim: 85 exec/s: 0 rss: 69Mb L: 83/83 MS: 1 CopyPart- 00:07:52.779 [2024-11-27 06:17:22.273929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:52.779 [2024-11-27 06:17:22.273960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.779 [2024-11-27 06:17:22.274060] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:52.779 [2024-11-27 06:17:22.274079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.779 [2024-11-27 06:17:22.274196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:52.779 [2024-11-27 06:17:22.274217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.779 [2024-11-27 06:17:22.274330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:52.779 [2024-11-27 06:17:22.274352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.779 #14 NEW cov: 11869 ft: 13223 corp: 9/632b lim: 85 exec/s: 0 rss: 69Mb L: 77/83 MS: 1 CopyPart- 00:07:53.038 [2024-11-27 06:17:22.314117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.038 [2024-11-27 06:17:22.314147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.038 [2024-11-27 06:17:22.314221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.038 [2024-11-27 06:17:22.314241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.038 [2024-11-27 06:17:22.314358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.038 [2024-11-27 06:17:22.314380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.038 [2024-11-27 06:17:22.314489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.038 [2024-11-27 06:17:22.314512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.038 #15 NEW cov: 11869 ft: 13262 corp: 10/709b lim: 85 exec/s: 0 rss: 69Mb L: 77/83 MS: 1 ChangeBinInt- 00:07:53.038 [2024-11-27 06:17:22.354237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.038 [2024-11-27 06:17:22.354266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.038 [2024-11-27 06:17:22.354356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.038 [2024-11-27 06:17:22.354376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.038 [2024-11-27 06:17:22.354490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.038 [2024-11-27 06:17:22.354511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.038 [2024-11-27 06:17:22.354627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 
00:07:53.038 [2024-11-27 06:17:22.354649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.038 #16 NEW cov: 11869 ft: 13293 corp: 11/785b lim: 85 exec/s: 0 rss: 69Mb L: 76/83 MS: 1 ChangeBinInt- 00:07:53.038 [2024-11-27 06:17:22.394374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.038 [2024-11-27 06:17:22.394407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.038 [2024-11-27 06:17:22.394504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.038 [2024-11-27 06:17:22.394525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.039 [2024-11-27 06:17:22.394635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.039 [2024-11-27 06:17:22.394653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.039 [2024-11-27 06:17:22.394771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.039 [2024-11-27 06:17:22.394792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.039 #17 NEW cov: 11869 ft: 13328 corp: 12/860b lim: 85 exec/s: 0 rss: 69Mb L: 75/83 MS: 1 EraseBytes- 00:07:53.039 [2024-11-27 06:17:22.434006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.039 [2024-11-27 06:17:22.434036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.039 [2024-11-27 06:17:22.434153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.039 [2024-11-27 06:17:22.434176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.039 #18 NEW cov: 11869 ft: 13747 corp: 13/898b lim: 85 exec/s: 0 rss: 69Mb L: 38/83 MS: 1 EraseBytes- 00:07:53.039 [2024-11-27 06:17:22.474575] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.039 [2024-11-27 06:17:22.474611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.039 [2024-11-27 06:17:22.474698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.039 [2024-11-27 06:17:22.474720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.039 [2024-11-27 06:17:22.474830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.039 [2024-11-27 06:17:22.474851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.039 [2024-11-27 06:17:22.474966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 
00:07:53.039 [2024-11-27 06:17:22.474987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.039 #19 NEW cov: 11869 ft: 13762 corp: 14/976b lim: 85 exec/s: 0 rss: 69Mb L: 78/83 MS: 1 InsertByte- 00:07:53.039 [2024-11-27 06:17:22.514811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.039 [2024-11-27 06:17:22.514844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.039 [2024-11-27 06:17:22.514960] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.039 [2024-11-27 06:17:22.514980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.039 [2024-11-27 06:17:22.515093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.039 [2024-11-27 06:17:22.515115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.039 [2024-11-27 06:17:22.515232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.039 [2024-11-27 06:17:22.515253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.039 #20 NEW cov: 11869 ft: 13798 corp: 15/1059b lim: 85 exec/s: 0 rss: 69Mb L: 83/83 MS: 1 CopyPart- 00:07:53.039 [2024-11-27 06:17:22.555154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.039 [2024-11-27 06:17:22.555184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.039 [2024-11-27 06:17:22.555296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.039 [2024-11-27 06:17:22.555331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.039 [2024-11-27 06:17:22.555445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.039 [2024-11-27 06:17:22.555465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.039 [2024-11-27 06:17:22.555579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.039 [2024-11-27 06:17:22.555605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.039 [2024-11-27 06:17:22.555733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:53.039 [2024-11-27 06:17:22.555756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:53.299 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:53.299 #21 NEW cov: 11892 ft: 13883 corp: 16/1144b lim: 85 exec/s: 0 rss: 69Mb L: 85/85 MS: 1 CMP- DE: "\002\000"- 
00:07:53.299 [2024-11-27 06:17:22.604856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.299 [2024-11-27 06:17:22.604885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.299 [2024-11-27 06:17:22.604999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.299 [2024-11-27 06:17:22.605020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.299 [2024-11-27 06:17:22.605131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.299 [2024-11-27 06:17:22.605155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.299 #22 NEW cov: 11892 ft: 14169 corp: 17/1197b lim: 85 exec/s: 0 rss: 69Mb L: 53/85 MS: 1 EraseBytes- 00:07:53.299 [2024-11-27 06:17:22.645202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.299 [2024-11-27 06:17:22.645235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.299 [2024-11-27 06:17:22.645327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.299 [2024-11-27 06:17:22.645345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.299 [2024-11-27 06:17:22.645457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.299 [2024-11-27 06:17:22.645471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.299 [2024-11-27 06:17:22.645594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.299 [2024-11-27 06:17:22.645621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.299 #23 NEW cov: 11892 ft: 14199 corp: 18/1274b lim: 85 exec/s: 23 rss: 69Mb L: 77/85 MS: 1 InsertByte- 00:07:53.299 [2024-11-27 06:17:22.685561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.299 [2024-11-27 06:17:22.685594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.299 [2024-11-27 06:17:22.685662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.299 [2024-11-27 06:17:22.685682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.299 [2024-11-27 06:17:22.685792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.299 [2024-11-27 06:17:22.685815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.299 [2024-11-27 06:17:22.685935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.299 [2024-11-27 06:17:22.685957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.299 [2024-11-27 06:17:22.686079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:53.299 [2024-11-27 06:17:22.686098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:53.299 #24 NEW cov: 11892 ft: 14215 corp: 19/1359b lim: 85 exec/s: 24 rss: 69Mb L: 85/85 MS: 1 ShuffleBytes- 00:07:53.299 [2024-11-27 06:17:22.735633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.299 [2024-11-27 06:17:22.735666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.299 [2024-11-27 06:17:22.735747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.299 [2024-11-27 06:17:22.735768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.299 [2024-11-27 06:17:22.735884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.299 [2024-11-27 06:17:22.735907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.299 [2024-11-27 06:17:22.736024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.299 [2024-11-27 06:17:22.736046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.300 [2024-11-27 06:17:22.736160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:53.300 [2024-11-27 06:17:22.736183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:53.300 #30 NEW cov: 11892 ft: 14227 corp: 20/1444b lim: 85 exec/s: 30 rss: 69Mb L: 85/85 MS: 1 CrossOver- 00:07:53.300 [2024-11-27 06:17:22.775817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.300 [2024-11-27 06:17:22.775848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.300 [2024-11-27 06:17:22.775940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.300 [2024-11-27 06:17:22.775963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.300 [2024-11-27 06:17:22.776077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.300 [2024-11-27 06:17:22.776099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.300 [2024-11-27 06:17:22.776213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.300 [2024-11-27 06:17:22.776235] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.300 [2024-11-27 06:17:22.776351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:53.300 [2024-11-27 06:17:22.776371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:53.300 #31 NEW cov: 11892 ft: 14268 corp: 21/1529b lim: 85 exec/s: 31 rss: 70Mb L: 85/85 MS: 1 ChangeByte- 00:07:53.300 [2024-11-27 06:17:22.825213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.300 [2024-11-27 06:17:22.825243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.300 [2024-11-27 06:17:22.825359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.300 [2024-11-27 06:17:22.825394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.559 #32 NEW cov: 11892 ft: 14298 corp: 22/1567b lim: 85 exec/s: 32 rss: 70Mb L: 38/85 MS: 1 ChangeBit- 00:07:53.560 [2024-11-27 06:17:22.865804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.560 [2024-11-27 06:17:22.865834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:22.865934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.560 [2024-11-27 06:17:22.865954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:22.866082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.560 [2024-11-27 06:17:22.866105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:22.866231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.560 [2024-11-27 06:17:22.866254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.560 #33 NEW cov: 11892 ft: 14302 corp: 23/1650b lim: 85 exec/s: 33 rss: 70Mb L: 83/85 MS: 1 ChangeBit- 00:07:53.560 [2024-11-27 06:17:22.905508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.560 [2024-11-27 06:17:22.905534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:22.905659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.560 [2024-11-27 06:17:22.905684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.560 #34 NEW cov: 11892 ft: 14317 corp: 24/1688b lim: 85 exec/s: 34 rss: 70Mb L: 38/85 MS: 1 CrossOver- 00:07:53.560 [2024-11-27 06:17:22.946331] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.560 [2024-11-27 06:17:22.946362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:22.946431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.560 [2024-11-27 06:17:22.946453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:22.946570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.560 [2024-11-27 06:17:22.946594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:22.946722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.560 [2024-11-27 06:17:22.946744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:22.946858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:53.560 [2024-11-27 06:17:22.946881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:53.560 #35 NEW cov: 11892 ft: 14381 corp: 25/1773b lim: 85 exec/s: 35 rss: 70Mb L: 85/85 MS: 1 ChangeBit- 00:07:53.560 [2024-11-27 06:17:22.996293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.560 [2024-11-27 06:17:22.996325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:22.996415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.560 [2024-11-27 06:17:22.996440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:22.996552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.560 [2024-11-27 06:17:22.996572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:22.996692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.560 [2024-11-27 06:17:22.996713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.560 #36 NEW cov: 11892 ft: 14389 corp: 26/1851b lim: 85 exec/s: 36 rss: 70Mb L: 78/85 MS: 1 ChangeBit- 00:07:53.560 [2024-11-27 06:17:23.036367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.560 [2024-11-27 06:17:23.036397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:23.036467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.560 
[2024-11-27 06:17:23.036487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:23.036604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.560 [2024-11-27 06:17:23.036622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:23.036743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.560 [2024-11-27 06:17:23.036766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.560 #37 NEW cov: 11892 ft: 14397 corp: 27/1932b lim: 85 exec/s: 37 rss: 70Mb L: 81/85 MS: 1 CopyPart- 00:07:53.560 [2024-11-27 06:17:23.076500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.560 [2024-11-27 06:17:23.076538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:23.076638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.560 [2024-11-27 06:17:23.076660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:23.076775] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.560 [2024-11-27 06:17:23.076799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.560 [2024-11-27 06:17:23.076914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.560 [2024-11-27 06:17:23.076934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.820 #38 NEW cov: 11892 ft: 14404 corp: 28/2010b lim: 85 exec/s: 38 rss: 70Mb L: 78/85 MS: 1 InsertByte- 00:07:53.820 [2024-11-27 06:17:23.126178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.820 [2024-11-27 06:17:23.126210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.820 [2024-11-27 06:17:23.126338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.820 [2024-11-27 06:17:23.126361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.820 #39 NEW cov: 11892 ft: 14408 corp: 29/2049b lim: 85 exec/s: 39 rss: 70Mb L: 39/85 MS: 1 InsertByte- 00:07:53.820 [2024-11-27 06:17:23.166579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.820 [2024-11-27 06:17:23.166619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.820 [2024-11-27 06:17:23.166709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 
00:07:53.820 [2024-11-27 06:17:23.166729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.820 [2024-11-27 06:17:23.166849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.820 [2024-11-27 06:17:23.166871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.820 #40 NEW cov: 11892 ft: 14420 corp: 30/2106b lim: 85 exec/s: 40 rss: 70Mb L: 57/85 MS: 1 CrossOver- 00:07:53.820 [2024-11-27 06:17:23.206870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.821 [2024-11-27 06:17:23.206900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.821 [2024-11-27 06:17:23.206971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.821 [2024-11-27 06:17:23.206994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.821 [2024-11-27 06:17:23.207111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.821 [2024-11-27 06:17:23.207137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.821 [2024-11-27 06:17:23.207259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.821 [2024-11-27 06:17:23.207284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.821 #41 NEW cov: 11892 ft: 14457 corp: 31/2189b lim: 85 exec/s: 41 rss: 70Mb L: 83/85 MS: 1 CopyPart- 00:07:53.821 [2024-11-27 06:17:23.257103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.821 [2024-11-27 06:17:23.257137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.821 [2024-11-27 06:17:23.257230] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.821 [2024-11-27 06:17:23.257252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.821 [2024-11-27 06:17:23.257375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.821 [2024-11-27 06:17:23.257398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.821 [2024-11-27 06:17:23.257516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.821 [2024-11-27 06:17:23.257539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.821 #42 NEW cov: 11892 ft: 14467 corp: 32/2265b lim: 85 exec/s: 42 rss: 70Mb L: 76/85 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:53.821 [2024-11-27 06:17:23.297251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) 
sqid:1 cid:0 nsid:0 00:07:53.821 [2024-11-27 06:17:23.297282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.821 [2024-11-27 06:17:23.297402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.821 [2024-11-27 06:17:23.297426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.821 [2024-11-27 06:17:23.297540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.821 [2024-11-27 06:17:23.297560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.821 [2024-11-27 06:17:23.297688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.821 [2024-11-27 06:17:23.297714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.821 #43 NEW cov: 11892 ft: 14490 corp: 33/2349b lim: 85 exec/s: 43 rss: 70Mb L: 84/85 MS: 1 InsertByte- 00:07:53.821 [2024-11-27 06:17:23.347633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:53.821 [2024-11-27 06:17:23.347666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.821 [2024-11-27 06:17:23.347753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:53.821 [2024-11-27 06:17:23.347772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.821 [2024-11-27 06:17:23.347883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:53.821 [2024-11-27 06:17:23.347908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.821 [2024-11-27 06:17:23.348022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:53.821 [2024-11-27 06:17:23.348045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.821 [2024-11-27 06:17:23.348165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:53.821 [2024-11-27 06:17:23.348185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:54.080 #44 NEW cov: 11892 ft: 14507 corp: 34/2434b lim: 85 exec/s: 44 rss: 70Mb L: 85/85 MS: 1 PersAutoDict- DE: "\002\000"- 00:07:54.080 [2024-11-27 06:17:23.387459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:54.080 [2024-11-27 06:17:23.387489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.080 [2024-11-27 06:17:23.387576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:54.080 [2024-11-27 06:17:23.387604] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.080 [2024-11-27 06:17:23.387733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:54.080 [2024-11-27 06:17:23.387753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.080 [2024-11-27 06:17:23.387870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:54.080 [2024-11-27 06:17:23.387890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.080 #45 NEW cov: 11892 ft: 14519 corp: 35/2507b lim: 85 exec/s: 45 rss: 70Mb L: 73/85 MS: 1 EraseBytes- 00:07:54.080 [2024-11-27 06:17:23.427348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:54.080 [2024-11-27 06:17:23.427382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.080 [2024-11-27 06:17:23.427474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:54.080 [2024-11-27 06:17:23.427494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.080 [2024-11-27 06:17:23.427619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:54.080 [2024-11-27 06:17:23.427641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.080 #46 NEW cov: 11892 ft: 14556 corp: 36/2560b lim: 85 exec/s: 46 rss: 70Mb L: 53/85 MS: 1 ChangeBit- 00:07:54.080 [2024-11-27 06:17:23.477707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:54.080 [2024-11-27 06:17:23.477737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.081 [2024-11-27 06:17:23.477814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:54.081 [2024-11-27 06:17:23.477835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.081 [2024-11-27 06:17:23.477946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:54.081 [2024-11-27 06:17:23.477969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.081 [2024-11-27 06:17:23.478083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:54.081 [2024-11-27 06:17:23.478107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.081 #50 NEW cov: 11892 ft: 14563 corp: 37/2642b lim: 85 exec/s: 50 rss: 70Mb L: 82/85 MS: 4 CrossOver-EraseBytes-ChangeBinInt-InsertRepeatedBytes- 00:07:54.081 [2024-11-27 06:17:23.527982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 
00:07:54.081 [2024-11-27 06:17:23.528015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.081 [2024-11-27 06:17:23.528124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:54.081 [2024-11-27 06:17:23.528146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.081 [2024-11-27 06:17:23.528259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:54.081 [2024-11-27 06:17:23.528277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.081 [2024-11-27 06:17:23.528395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:54.081 [2024-11-27 06:17:23.528420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.081 #51 NEW cov: 11892 ft: 14569 corp: 38/2715b lim: 85 exec/s: 51 rss: 70Mb L: 73/85 MS: 1 ChangeBinInt- 00:07:54.081 [2024-11-27 06:17:23.577703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:54.081 [2024-11-27 06:17:23.577735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.081 [2024-11-27 06:17:23.577833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:54.081 [2024-11-27 06:17:23.577853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.081 [2024-11-27 06:17:23.577970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:54.081 [2024-11-27 06:17:23.577992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.081 #52 NEW cov: 11892 ft: 14580 corp: 39/2768b lim: 85 exec/s: 52 rss: 70Mb L: 53/85 MS: 1 CopyPart- 00:07:54.341 [2024-11-27 06:17:23.628114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:54.341 [2024-11-27 06:17:23.628145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.341 [2024-11-27 06:17:23.628214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:54.341 [2024-11-27 06:17:23.628235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.341 [2024-11-27 06:17:23.628352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:54.341 [2024-11-27 06:17:23.628374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.341 [2024-11-27 06:17:23.628488] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:54.341 [2024-11-27 06:17:23.628510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.341 #53 NEW cov: 11892 ft: 14603 corp: 40/2844b lim: 85 exec/s: 53 rss: 70Mb L: 76/85 MS: 1 InsertRepeatedBytes- 00:07:54.341 [2024-11-27 06:17:23.668526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:54.341 [2024-11-27 06:17:23.668554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.341 [2024-11-27 06:17:23.668642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:54.341 [2024-11-27 06:17:23.668664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.341 [2024-11-27 06:17:23.668779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:54.342 [2024-11-27 06:17:23.668801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.342 [2024-11-27 06:17:23.668916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:54.342 [2024-11-27 06:17:23.668935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.342 [2024-11-27 06:17:23.669043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:54.342 [2024-11-27 06:17:23.669063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:54.342 #54 NEW cov: 11892 ft: 14670 corp: 41/2929b lim: 85 exec/s: 27 rss: 70Mb L: 85/85 MS: 1 ShuffleBytes- 00:07:54.342 #54 DONE cov: 11892 ft: 14670 corp: 41/2929b lim: 85 exec/s: 27 rss: 70Mb 00:07:54.342 ###### Recommended dictionary. ###### 00:07:54.342 "\002\000" # Uses: 2 00:07:54.342 "\001\000\000\000" # Uses: 0 00:07:54.342 ###### End of recommended dictionary. 
###### 00:07:54.342 Done 54 runs in 2 second(s) 00:07:54.342 06:17:23 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:07:54.342 06:17:23 -- ../common.sh@72 -- # (( i++ )) 00:07:54.342 06:17:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:54.342 06:17:23 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:07:54.342 06:17:23 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:07:54.342 06:17:23 -- nvmf/run.sh@24 -- # local timen=1 00:07:54.342 06:17:23 -- nvmf/run.sh@25 -- # local core=0x1 00:07:54.342 06:17:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:54.342 06:17:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:07:54.342 06:17:23 -- nvmf/run.sh@29 -- # printf %02d 23 00:07:54.342 06:17:23 -- nvmf/run.sh@29 -- # port=4423 00:07:54.342 06:17:23 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:54.342 06:17:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:07:54.342 06:17:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:54.342 06:17:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:07:54.342 [2024-11-27 06:17:23.845285] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:54.342 [2024-11-27 06:17:23.845349] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid39324 ] 00:07:54.601 EAL: No free 2048 kB hugepages reported on node 1 00:07:54.601 [2024-11-27 06:17:24.027024] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.601 [2024-11-27 06:17:24.090658] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:54.601 [2024-11-27 06:17:24.090786] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.861 [2024-11-27 06:17:24.148532] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:54.861 [2024-11-27 06:17:24.164872] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:07:54.861 INFO: Running with entropic power schedule (0xFF, 100). 00:07:54.861 INFO: Seed: 4096580733 00:07:54.861 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:54.861 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:54.861 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:54.861 INFO: A corpus is not provided, starting from an empty corpus 00:07:54.861 #2 INITED exec/s: 0 rss: 60Mb 00:07:54.861 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:54.861 This may also happen if the target rejected all inputs we tried so far 00:07:54.861 [2024-11-27 06:17:24.220087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:54.861 [2024-11-27 06:17:24.220117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.861 [2024-11-27 06:17:24.220173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:54.861 [2024-11-27 06:17:24.220188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.121 NEW_FUNC[1/671]: 0x465118 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:07:55.121 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:55.121 #6 NEW cov: 11598 ft: 11585 corp: 2/13b lim: 25 exec/s: 0 rss: 68Mb L: 12/12 MS: 4 InsertByte-CopyPart-InsertByte-CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:55.121 [2024-11-27 06:17:24.520928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.121 [2024-11-27 06:17:24.520963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.121 [2024-11-27 06:17:24.521007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.121 [2024-11-27 06:17:24.521022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.121 [2024-11-27 06:17:24.521080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:55.121 [2024-11-27 06:17:24.521095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.121 #17 NEW cov: 11711 ft: 12422 corp: 3/31b lim: 25 exec/s: 0 rss: 68Mb L: 18/18 MS: 1 CopyPart- 00:07:55.121 [2024-11-27 06:17:24.570689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.121 [2024-11-27 06:17:24.570718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.121 #23 NEW cov: 11717 ft: 13127 corp: 4/40b lim: 25 exec/s: 0 rss: 68Mb L: 9/18 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:55.121 [2024-11-27 06:17:24.610937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.121 [2024-11-27 06:17:24.610964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.121 [2024-11-27 06:17:24.610999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.121 [2024-11-27 06:17:24.611015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.121 #24 NEW cov: 11802 ft: 13415 corp: 5/52b lim: 25 exec/s: 0 rss: 68Mb L: 12/18 MS: 1 ChangeByte- 00:07:55.121 [2024-11-27 
06:17:24.651210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.121 [2024-11-27 06:17:24.651239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.121 [2024-11-27 06:17:24.651278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.121 [2024-11-27 06:17:24.651292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.121 [2024-11-27 06:17:24.651347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:55.121 [2024-11-27 06:17:24.651363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.381 #25 NEW cov: 11802 ft: 13493 corp: 6/70b lim: 25 exec/s: 0 rss: 68Mb L: 18/18 MS: 1 CrossOver- 00:07:55.381 [2024-11-27 06:17:24.691064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.381 [2024-11-27 06:17:24.691092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.381 #26 NEW cov: 11802 ft: 13538 corp: 7/76b lim: 25 exec/s: 0 rss: 68Mb L: 6/18 MS: 1 InsertRepeatedBytes- 00:07:55.381 [2024-11-27 06:17:24.731439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.381 [2024-11-27 06:17:24.731470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.381 [2024-11-27 06:17:24.731505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.381 [2024-11-27 06:17:24.731521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.381 [2024-11-27 06:17:24.731576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:55.381 [2024-11-27 06:17:24.731591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.381 #27 NEW cov: 11802 ft: 13655 corp: 8/95b lim: 25 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 InsertByte- 00:07:55.381 [2024-11-27 06:17:24.771665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.381 [2024-11-27 06:17:24.771693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.381 [2024-11-27 06:17:24.771730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.381 [2024-11-27 06:17:24.771744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.381 [2024-11-27 06:17:24.771797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:55.381 [2024-11-27 06:17:24.771813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.381 [2024-11-27 06:17:24.771884] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:55.381 [2024-11-27 06:17:24.771900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.381 #28 NEW cov: 11802 ft: 14154 corp: 9/115b lim: 25 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 InsertByte- 00:07:55.381 [2024-11-27 06:17:24.811800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.381 [2024-11-27 06:17:24.811828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.381 [2024-11-27 06:17:24.811883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.381 [2024-11-27 06:17:24.811896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.381 [2024-11-27 06:17:24.811951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:55.381 [2024-11-27 06:17:24.811966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.381 [2024-11-27 06:17:24.812020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:55.381 [2024-11-27 06:17:24.812036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.381 #29 NEW cov: 11802 ft: 14210 corp: 10/138b lim: 25 exec/s: 0 rss: 69Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:07:55.381 [2024-11-27 06:17:24.851492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.381 [2024-11-27 06:17:24.851518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.381 #30 NEW cov: 11802 ft: 14298 corp: 11/143b lim: 25 exec/s: 0 rss: 69Mb L: 5/23 MS: 1 CrossOver- 00:07:55.381 [2024-11-27 06:17:24.891624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.381 [2024-11-27 06:17:24.891652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.641 #36 NEW cov: 11802 ft: 14341 corp: 12/152b lim: 25 exec/s: 0 rss: 69Mb L: 9/23 MS: 1 ChangeBit- 00:07:55.641 [2024-11-27 06:17:24.932013] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.641 [2024-11-27 06:17:24.932040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.641 [2024-11-27 06:17:24.932076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.641 [2024-11-27 06:17:24.932090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.641 [2024-11-27 06:17:24.932145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:55.641 [2024-11-27 06:17:24.932160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.641 #37 NEW cov: 11802 ft: 14380 corp: 13/171b lim: 25 exec/s: 0 rss: 69Mb L: 19/23 MS: 1 InsertByte- 00:07:55.641 [2024-11-27 06:17:24.972211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.641 [2024-11-27 06:17:24.972238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.641 [2024-11-27 06:17:24.972279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.641 [2024-11-27 06:17:24.972295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.641 [2024-11-27 06:17:24.972349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:55.641 [2024-11-27 06:17:24.972365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.641 [2024-11-27 06:17:24.972420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:55.641 [2024-11-27 06:17:24.972435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.641 #38 NEW cov: 11802 ft: 14488 corp: 14/192b lim: 25 exec/s: 0 rss: 69Mb L: 21/23 MS: 1 InsertRepeatedBytes- 00:07:55.641 [2024-11-27 06:17:25.012408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.641 [2024-11-27 06:17:25.012437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.641 [2024-11-27 06:17:25.012473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.641 [2024-11-27 06:17:25.012489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.641 [2024-11-27 06:17:25.012545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:55.641 [2024-11-27 06:17:25.012560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.641 [2024-11-27 06:17:25.012614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:55.641 [2024-11-27 06:17:25.012631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.641 #39 NEW cov: 11802 ft: 14529 corp: 15/213b lim: 25 exec/s: 0 rss: 69Mb L: 21/23 MS: 1 InsertRepeatedBytes- 00:07:55.641 [2024-11-27 06:17:25.052214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.641 [2024-11-27 06:17:25.052242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.641 [2024-11-27 06:17:25.052283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.641 [2024-11-27 06:17:25.052303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.641 #40 NEW cov: 11802 ft: 14554 corp: 16/226b lim: 25 exec/s: 0 rss: 69Mb L: 13/23 MS: 1 EraseBytes- 00:07:55.641 [2024-11-27 06:17:25.092241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.642 [2024-11-27 06:17:25.092268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.642 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:55.642 #41 NEW cov: 11825 ft: 14622 corp: 17/235b lim: 25 exec/s: 0 rss: 69Mb L: 9/23 MS: 1 CopyPart- 00:07:55.642 [2024-11-27 06:17:25.132437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.642 [2024-11-27 06:17:25.132464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.642 [2024-11-27 06:17:25.132517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.642 [2024-11-27 06:17:25.132531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.642 #42 NEW cov: 11825 ft: 14639 corp: 18/247b lim: 25 exec/s: 0 rss: 69Mb L: 12/23 MS: 1 ChangeBinInt- 00:07:55.642 [2024-11-27 06:17:25.172805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.642 [2024-11-27 06:17:25.172833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.642 [2024-11-27 06:17:25.172883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.642 [2024-11-27 06:17:25.172899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.642 [2024-11-27 06:17:25.172956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:55.642 [2024-11-27 06:17:25.172971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.642 [2024-11-27 06:17:25.173025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:55.642 [2024-11-27 06:17:25.173039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.901 #43 NEW cov: 11825 ft: 14651 corp: 19/269b lim: 25 exec/s: 43 rss: 69Mb L: 22/23 MS: 1 InsertByte- 00:07:55.901 [2024-11-27 06:17:25.212809] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.901 [2024-11-27 06:17:25.212836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.901 [2024-11-27 06:17:25.212889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.901 [2024-11-27 06:17:25.212905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.901 [2024-11-27 06:17:25.212960] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:55.901 [2024-11-27 06:17:25.212975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.901 #44 NEW cov: 11825 ft: 14666 corp: 20/288b lim: 25 exec/s: 44 rss: 69Mb L: 19/23 MS: 1 ShuffleBytes- 00:07:55.901 [2024-11-27 06:17:25.252764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.901 [2024-11-27 06:17:25.252791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.901 [2024-11-27 06:17:25.252827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.901 [2024-11-27 06:17:25.252846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.901 #45 NEW cov: 11825 ft: 14676 corp: 21/301b lim: 25 exec/s: 45 rss: 69Mb L: 13/23 MS: 1 CrossOver- 00:07:55.901 [2024-11-27 06:17:25.282768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.901 [2024-11-27 06:17:25.282795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.901 #46 NEW cov: 11825 ft: 14683 corp: 22/306b lim: 25 exec/s: 46 rss: 69Mb L: 5/23 MS: 1 ChangeBinInt- 00:07:55.901 [2024-11-27 06:17:25.323160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.901 [2024-11-27 06:17:25.323186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.901 [2024-11-27 06:17:25.323236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.901 [2024-11-27 06:17:25.323252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.901 [2024-11-27 06:17:25.323308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:55.901 [2024-11-27 06:17:25.323340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.901 #47 NEW cov: 11825 ft: 14712 corp: 23/324b lim: 25 exec/s: 47 rss: 69Mb L: 18/23 MS: 1 CrossOver- 00:07:55.901 [2024-11-27 06:17:25.363205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.901 [2024-11-27 06:17:25.363231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.901 [2024-11-27 06:17:25.363275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.901 [2024-11-27 06:17:25.363290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.901 [2024-11-27 06:17:25.363347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:55.901 [2024-11-27 06:17:25.363362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.901 #48 NEW cov: 11825 ft: 14717 corp: 24/342b lim: 25 exec/s: 48 rss: 69Mb L: 18/23 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:55.901 [2024-11-27 06:17:25.403442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:55.901 [2024-11-27 06:17:25.403469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.901 [2024-11-27 06:17:25.403513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:55.901 [2024-11-27 06:17:25.403528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.901 [2024-11-27 06:17:25.403581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:55.901 [2024-11-27 06:17:25.403596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.901 [2024-11-27 06:17:25.403641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:55.901 [2024-11-27 06:17:25.403656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.901 #49 NEW cov: 11825 ft: 14765 corp: 25/364b lim: 25 exec/s: 49 rss: 69Mb L: 22/23 MS: 1 InsertByte- 00:07:56.162 [2024-11-27 06:17:25.443500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.162 [2024-11-27 06:17:25.443532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.162 [2024-11-27 06:17:25.443569] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:56.162 [2024-11-27 06:17:25.443584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.162 [2024-11-27 06:17:25.443696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:56.162 [2024-11-27 06:17:25.443712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.162 #50 NEW cov: 11825 ft: 14835 corp: 26/383b lim: 25 exec/s: 50 rss: 69Mb L: 19/23 MS: 1 ChangeBit- 00:07:56.162 [2024-11-27 06:17:25.483367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.162 [2024-11-27 06:17:25.483394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.162 #51 NEW cov: 11825 ft: 14840 corp: 27/392b lim: 25 exec/s: 51 rss: 69Mb L: 9/23 MS: 1 CMP- DE: "\376\377\377\377"- 00:07:56.162 [2024-11-27 06:17:25.523495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.162 [2024-11-27 06:17:25.523522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.162 #52 NEW cov: 11825 ft: 14846 corp: 28/397b lim: 25 exec/s: 52 rss: 70Mb L: 5/23 MS: 1 
PersAutoDict- DE: "\376\377\377\377"- 00:07:56.162 [2024-11-27 06:17:25.563580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.162 [2024-11-27 06:17:25.563611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.162 #53 NEW cov: 11825 ft: 14866 corp: 29/404b lim: 25 exec/s: 53 rss: 70Mb L: 7/23 MS: 1 CrossOver- 00:07:56.162 [2024-11-27 06:17:25.603722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.162 [2024-11-27 06:17:25.603748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.162 #54 NEW cov: 11825 ft: 14886 corp: 30/413b lim: 25 exec/s: 54 rss: 70Mb L: 9/23 MS: 1 CMP- DE: "\000\010"- 00:07:56.162 [2024-11-27 06:17:25.643941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.162 [2024-11-27 06:17:25.643969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.162 [2024-11-27 06:17:25.644004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:56.162 [2024-11-27 06:17:25.644020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.162 #55 NEW cov: 11825 ft: 14921 corp: 31/425b lim: 25 exec/s: 55 rss: 70Mb L: 12/23 MS: 1 ChangeBinInt- 00:07:56.162 [2024-11-27 06:17:25.684347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.162 [2024-11-27 06:17:25.684375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.162 [2024-11-27 06:17:25.684433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:56.162 [2024-11-27 06:17:25.684449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.162 [2024-11-27 06:17:25.684506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:56.162 [2024-11-27 06:17:25.684522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.162 [2024-11-27 06:17:25.684585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:56.162 [2024-11-27 06:17:25.684605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.422 #56 NEW cov: 11825 ft: 14935 corp: 32/449b lim: 25 exec/s: 56 rss: 70Mb L: 24/24 MS: 1 CrossOver- 00:07:56.422 [2024-11-27 06:17:25.724434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.422 [2024-11-27 06:17:25.724461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.422 [2024-11-27 06:17:25.724510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:56.422 [2024-11-27 
06:17:25.724525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.422 [2024-11-27 06:17:25.724582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:56.422 [2024-11-27 06:17:25.724602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.422 [2024-11-27 06:17:25.724675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:56.422 [2024-11-27 06:17:25.724691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.422 #57 NEW cov: 11825 ft: 14947 corp: 33/469b lim: 25 exec/s: 57 rss: 70Mb L: 20/24 MS: 1 CMP- DE: "\001\222-\232\337\247,\334"- 00:07:56.422 [2024-11-27 06:17:25.764432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.422 [2024-11-27 06:17:25.764459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.422 [2024-11-27 06:17:25.764503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:56.422 [2024-11-27 06:17:25.764519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.422 [2024-11-27 06:17:25.764575] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:56.422 [2024-11-27 06:17:25.764589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.422 #58 NEW cov: 11825 ft: 14971 corp: 34/488b lim: 25 exec/s: 58 rss: 70Mb L: 19/24 MS: 1 EraseBytes- 00:07:56.422 [2024-11-27 06:17:25.804419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.422 [2024-11-27 06:17:25.804445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.422 [2024-11-27 06:17:25.804493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:56.422 [2024-11-27 06:17:25.804509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.422 #59 NEW cov: 11825 ft: 14977 corp: 35/501b lim: 25 exec/s: 59 rss: 70Mb L: 13/24 MS: 1 ChangeBinInt- 00:07:56.422 [2024-11-27 06:17:25.844712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.422 [2024-11-27 06:17:25.844739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.422 [2024-11-27 06:17:25.844777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:56.422 [2024-11-27 06:17:25.844792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.422 [2024-11-27 06:17:25.844848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:56.422 
[2024-11-27 06:17:25.844867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.423 #60 NEW cov: 11825 ft: 15037 corp: 36/519b lim: 25 exec/s: 60 rss: 70Mb L: 18/24 MS: 1 ChangeByte- 00:07:56.423 [2024-11-27 06:17:25.884966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.423 [2024-11-27 06:17:25.884993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.423 [2024-11-27 06:17:25.885065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:56.423 [2024-11-27 06:17:25.885080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.423 [2024-11-27 06:17:25.885138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:56.423 [2024-11-27 06:17:25.885155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.423 [2024-11-27 06:17:25.885209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:56.423 [2024-11-27 06:17:25.885224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.423 [2024-11-27 06:17:25.885282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:07:56.423 [2024-11-27 06:17:25.885296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:56.423 #61 NEW cov: 11825 ft: 15117 corp: 37/544b lim: 25 exec/s: 61 rss: 70Mb L: 25/25 MS: 1 CopyPart- 00:07:56.423 [2024-11-27 06:17:25.924647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.423 [2024-11-27 06:17:25.924674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.423 #63 NEW cov: 11825 ft: 15141 corp: 38/552b lim: 25 exec/s: 63 rss: 70Mb L: 8/25 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:56.682 [2024-11-27 06:17:25.964785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.682 [2024-11-27 06:17:25.964811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.682 #64 NEW cov: 11825 ft: 15147 corp: 39/561b lim: 25 exec/s: 64 rss: 70Mb L: 9/25 MS: 1 ChangeByte- 00:07:56.682 [2024-11-27 06:17:26.004995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.682 [2024-11-27 06:17:26.005022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.682 [2024-11-27 06:17:26.005076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:56.682 [2024-11-27 06:17:26.005091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.682 #65 NEW cov: 11825 ft: 15153 
corp: 40/573b lim: 25 exec/s: 65 rss: 70Mb L: 12/25 MS: 1 PersAutoDict- DE: "\001\222-\232\337\247,\334"- 00:07:56.682 [2024-11-27 06:17:26.045323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.682 [2024-11-27 06:17:26.045351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.682 [2024-11-27 06:17:26.045410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:56.682 [2024-11-27 06:17:26.045426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.682 [2024-11-27 06:17:26.045481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:56.682 [2024-11-27 06:17:26.045496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.683 [2024-11-27 06:17:26.045555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:56.683 [2024-11-27 06:17:26.045571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.683 #66 NEW cov: 11825 ft: 15190 corp: 41/593b lim: 25 exec/s: 66 rss: 70Mb L: 20/25 MS: 1 InsertByte- 00:07:56.683 [2024-11-27 06:17:26.085105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.683 [2024-11-27 06:17:26.085132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.683 #67 NEW cov: 11825 ft: 15192 corp: 42/602b lim: 25 exec/s: 67 rss: 70Mb L: 9/25 MS: 1 ChangeByte- 00:07:56.683 [2024-11-27 06:17:26.125304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.683 [2024-11-27 06:17:26.125331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.683 [2024-11-27 06:17:26.125368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:56.683 [2024-11-27 06:17:26.125382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.683 #68 NEW cov: 11825 ft: 15215 corp: 43/615b lim: 25 exec/s: 68 rss: 70Mb L: 13/25 MS: 1 ShuffleBytes- 00:07:56.683 [2024-11-27 06:17:26.165426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:56.683 [2024-11-27 06:17:26.165454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.683 [2024-11-27 06:17:26.165503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:56.683 [2024-11-27 06:17:26.165521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.683 #69 NEW cov: 11825 ft: 15218 corp: 44/625b lim: 25 exec/s: 69 rss: 70Mb L: 10/25 MS: 1 EraseBytes- 00:07:56.683 [2024-11-27 06:17:26.205604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT 
(0e) sqid:1 cid:0 nsid:0 00:07:56.683 [2024-11-27 06:17:26.205633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.683 [2024-11-27 06:17:26.205683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:56.683 [2024-11-27 06:17:26.205698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.943 #70 NEW cov: 11825 ft: 15222 corp: 45/638b lim: 25 exec/s: 35 rss: 70Mb L: 13/25 MS: 1 CopyPart- 00:07:56.943 #70 DONE cov: 11825 ft: 15222 corp: 45/638b lim: 25 exec/s: 35 rss: 70Mb 00:07:56.943 ###### Recommended dictionary. ###### 00:07:56.943 "\377\377\377\377\377\377\377\377" # Uses: 2 00:07:56.943 "\376\377\377\377" # Uses: 1 00:07:56.943 "\000\010" # Uses: 0 00:07:56.943 "\001\222-\232\337\247,\334" # Uses: 1 00:07:56.943 ###### End of recommended dictionary. ###### 00:07:56.943 Done 70 runs in 2 second(s) 00:07:56.943 06:17:26 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:07:56.943 06:17:26 -- ../common.sh@72 -- # (( i++ )) 00:07:56.943 06:17:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:56.943 06:17:26 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:07:56.943 06:17:26 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:07:56.943 06:17:26 -- nvmf/run.sh@24 -- # local timen=1 00:07:56.943 06:17:26 -- nvmf/run.sh@25 -- # local core=0x1 00:07:56.943 06:17:26 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:07:56.943 06:17:26 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:07:56.943 06:17:26 -- nvmf/run.sh@29 -- # printf %02d 24 00:07:56.943 06:17:26 -- nvmf/run.sh@29 -- # port=4424 00:07:56.943 06:17:26 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:07:56.943 06:17:26 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:07:56.943 06:17:26 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:56.943 06:17:26 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:07:56.943 [2024-11-27 06:17:26.394017] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:56.943 [2024-11-27 06:17:26.394099] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid39706 ] 00:07:56.943 EAL: No free 2048 kB hugepages reported on node 1 00:07:57.203 [2024-11-27 06:17:26.575973] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.203 [2024-11-27 06:17:26.640331] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:57.203 [2024-11-27 06:17:26.640475] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.203 [2024-11-27 06:17:26.698513] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:57.203 [2024-11-27 06:17:26.714900] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:07:57.203 INFO: Running with entropic power schedule (0xFF, 100). 00:07:57.203 INFO: Seed: 2350607928 00:07:57.462 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:57.462 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:57.462 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:07:57.462 INFO: A corpus is not provided, starting from an empty corpus 00:07:57.462 #2 INITED exec/s: 0 rss: 60Mb 00:07:57.462 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:57.462 This may also happen if the target rejected all inputs we tried so far 00:07:57.462 [2024-11-27 06:17:26.770085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458472809492 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.462 [2024-11-27 06:17:26.770116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.462 [2024-11-27 06:17:26.770172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.462 [2024-11-27 06:17:26.770188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.720 NEW_FUNC[1/672]: 0x466208 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:07:57.720 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:57.720 #15 NEW cov: 11670 ft: 11664 corp: 2/55b lim: 100 exec/s: 0 rss: 68Mb L: 54/54 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:07:57.720 [2024-11-27 06:17:27.091113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.721 [2024-11-27 06:17:27.091170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.721 [2024-11-27 06:17:27.091256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.721 [2024-11-27 06:17:27.091285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:07:57.721 [2024-11-27 06:17:27.091361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.721 [2024-11-27 06:17:27.091388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.721 #19 NEW cov: 11783 ft: 12642 corp: 3/129b lim: 100 exec/s: 0 rss: 68Mb L: 74/74 MS: 4 CopyPart-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:57.721 [2024-11-27 06:17:27.140862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458472809492 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.721 [2024-11-27 06:17:27.140889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.721 [2024-11-27 06:17:27.140925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.721 [2024-11-27 06:17:27.140940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.721 #20 NEW cov: 11789 ft: 12939 corp: 4/183b lim: 100 exec/s: 0 rss: 68Mb L: 54/74 MS: 1 ShuffleBytes- 00:07:57.721 [2024-11-27 06:17:27.180965] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458226878996 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.721 [2024-11-27 06:17:27.180992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.721 [2024-11-27 06:17:27.181034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.721 [2024-11-27 06:17:27.181048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.721 #25 NEW cov: 11874 ft: 13224 corp: 5/239b lim: 100 exec/s: 0 rss: 68Mb L: 56/74 MS: 5 ChangeByte-ChangeBit-CopyPart-ShuffleBytes-CrossOver- 00:07:57.721 [2024-11-27 06:17:27.221417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9910603676685797769 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.721 [2024-11-27 06:17:27.221444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.721 [2024-11-27 06:17:27.221480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.721 [2024-11-27 06:17:27.221495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.721 [2024-11-27 06:17:27.221546] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.721 [2024-11-27 06:17:27.221561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.721 [2024-11-27 06:17:27.221615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 
lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.721 [2024-11-27 06:17:27.221630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.721 #26 NEW cov: 11874 ft: 13703 corp: 6/319b lim: 100 exec/s: 0 rss: 68Mb L: 80/80 MS: 1 InsertRepeatedBytes- 00:07:57.979 [2024-11-27 06:17:27.261215] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458472809492 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.979 [2024-11-27 06:17:27.261240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.979 [2024-11-27 06:17:27.261273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456769922068 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.979 [2024-11-27 06:17:27.261293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.979 #27 NEW cov: 11874 ft: 13749 corp: 7/373b lim: 100 exec/s: 0 rss: 68Mb L: 54/80 MS: 1 ChangeBit- 00:07:57.979 [2024-11-27 06:17:27.301605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.979 [2024-11-27 06:17:27.301632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.979 [2024-11-27 06:17:27.301680] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.979 [2024-11-27 06:17:27.301695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.979 [2024-11-27 06:17:27.301745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.979 [2024-11-27 06:17:27.301759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.979 [2024-11-27 06:17:27.301808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.979 [2024-11-27 06:17:27.301823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.979 #28 NEW cov: 11874 ft: 13822 corp: 8/465b lim: 100 exec/s: 0 rss: 68Mb L: 92/92 MS: 1 CopyPart- 00:07:57.979 [2024-11-27 06:17:27.341740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9910603676685797769 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.979 [2024-11-27 06:17:27.341766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.979 [2024-11-27 06:17:27.341814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9910603676518025609 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.979 [2024-11-27 06:17:27.341828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.979 [2024-11-27 06:17:27.341876] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.979 [2024-11-27 06:17:27.341907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.979 [2024-11-27 06:17:27.341959] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.979 [2024-11-27 06:17:27.341973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.979 #29 NEW cov: 11874 ft: 13881 corp: 9/545b lim: 100 exec/s: 0 rss: 68Mb L: 80/92 MS: 1 ChangeBinInt- 00:07:57.979 [2024-11-27 06:17:27.391567] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458472809492 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.979 [2024-11-27 06:17:27.391593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.979 [2024-11-27 06:17:27.391632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.979 [2024-11-27 06:17:27.391646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.979 #30 NEW cov: 11874 ft: 13902 corp: 10/599b lim: 100 exec/s: 0 rss: 68Mb L: 54/92 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:57.979 [2024-11-27 06:17:27.432009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9910603676685797769 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.979 [2024-11-27 06:17:27.432038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.979 [2024-11-27 06:17:27.432090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9910603676518025609 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.980 [2024-11-27 06:17:27.432104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.980 [2024-11-27 06:17:27.432153] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.980 [2024-11-27 06:17:27.432168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.980 [2024-11-27 06:17:27.432218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.980 [2024-11-27 06:17:27.432233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.980 #31 NEW cov: 11874 ft: 13921 corp: 11/680b lim: 100 exec/s: 0 rss: 69Mb L: 81/92 MS: 1 InsertByte- 00:07:57.980 [2024-11-27 06:17:27.472095] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.980 [2024-11-27 06:17:27.472122] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.980 [2024-11-27 06:17:27.472168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.980 [2024-11-27 06:17:27.472183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.980 [2024-11-27 06:17:27.472231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.980 [2024-11-27 06:17:27.472244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.980 [2024-11-27 06:17:27.472291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.980 [2024-11-27 06:17:27.472306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.980 #34 NEW cov: 11874 ft: 14009 corp: 12/779b lim: 100 exec/s: 0 rss: 69Mb L: 99/99 MS: 3 ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:57.980 [2024-11-27 06:17:27.512237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.980 [2024-11-27 06:17:27.512263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.980 [2024-11-27 06:17:27.512309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.980 [2024-11-27 06:17:27.512324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.980 [2024-11-27 06:17:27.512371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.980 [2024-11-27 06:17:27.512383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.980 [2024-11-27 06:17:27.512434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.980 [2024-11-27 06:17:27.512450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.238 #35 NEW cov: 11874 ft: 14036 corp: 13/873b lim: 100 exec/s: 0 rss: 69Mb L: 94/99 MS: 1 CopyPart- 00:07:58.238 [2024-11-27 06:17:27.552205] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.238 [2024-11-27 06:17:27.552231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.238 [2024-11-27 06:17:27.552265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1241513984 len:1 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:58.238 [2024-11-27 06:17:27.552279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.238 [2024-11-27 06:17:27.552329] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.238 [2024-11-27 06:17:27.552344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.238 #36 NEW cov: 11874 ft: 14050 corp: 14/947b lim: 100 exec/s: 0 rss: 69Mb L: 74/99 MS: 1 ChangeBinInt- 00:07:58.238 [2024-11-27 06:17:27.592152] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2095321804814160916 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.238 [2024-11-27 06:17:27.592178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.238 [2024-11-27 06:17:27.592223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456769922068 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.238 [2024-11-27 06:17:27.592238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.238 #37 NEW cov: 11874 ft: 14115 corp: 15/1001b lim: 100 exec/s: 0 rss: 69Mb L: 54/99 MS: 1 ChangeBinInt- 00:07:58.238 [2024-11-27 06:17:27.632277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458226878996 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.238 [2024-11-27 06:17:27.632302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.238 [2024-11-27 06:17:27.632354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.238 [2024-11-27 06:17:27.632369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.238 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:58.238 #38 NEW cov: 11897 ft: 14148 corp: 16/1048b lim: 100 exec/s: 0 rss: 69Mb L: 47/99 MS: 1 EraseBytes- 00:07:58.238 [2024-11-27 06:17:27.682430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458226878996 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.238 [2024-11-27 06:17:27.682457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.238 [2024-11-27 06:17:27.682494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.238 [2024-11-27 06:17:27.682510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.238 #39 NEW cov: 11897 ft: 14168 corp: 17/1095b lim: 100 exec/s: 0 rss: 69Mb L: 47/99 MS: 1 ChangeBit- 00:07:58.238 [2024-11-27 06:17:27.722570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458226878996 len:5141 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:58.238 [2024-11-27 06:17:27.722608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.238 [2024-11-27 06:17:27.722660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.238 [2024-11-27 06:17:27.722675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.238 #40 NEW cov: 11897 ft: 14249 corp: 18/1152b lim: 100 exec/s: 0 rss: 69Mb L: 57/99 MS: 1 InsertByte- 00:07:58.238 [2024-11-27 06:17:27.762972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.238 [2024-11-27 06:17:27.762999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.238 [2024-11-27 06:17:27.763045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.238 [2024-11-27 06:17:27.763060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.238 [2024-11-27 06:17:27.763108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.238 [2024-11-27 06:17:27.763124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.238 [2024-11-27 06:17:27.763172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.238 [2024-11-27 06:17:27.763188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.498 #41 NEW cov: 11897 ft: 14264 corp: 19/1249b lim: 100 exec/s: 41 rss: 69Mb L: 97/99 MS: 1 CopyPart- 00:07:58.498 [2024-11-27 06:17:27.813128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458226878996 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.498 [2024-11-27 06:17:27.813154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.498 [2024-11-27 06:17:27.813217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803370862187540 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.498 [2024-11-27 06:17:27.813232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.498 [2024-11-27 06:17:27.813281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.498 [2024-11-27 06:17:27.813297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.498 [2024-11-27 06:17:27.813344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.498 [2024-11-27 06:17:27.813358] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.499 #42 NEW cov: 11897 ft: 14300 corp: 20/1343b lim: 100 exec/s: 42 rss: 69Mb L: 94/99 MS: 1 InsertRepeatedBytes- 00:07:58.499 [2024-11-27 06:17:27.853226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:27.853252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.499 [2024-11-27 06:17:27.853302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:27.853317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.499 [2024-11-27 06:17:27.853370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:27.853385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.499 [2024-11-27 06:17:27.853434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:27.853448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.499 #43 NEW cov: 11897 ft: 14348 corp: 21/1435b lim: 100 exec/s: 43 rss: 69Mb L: 92/99 MS: 1 CMP- DE: "\017\000\000\000\000\000\000\000"- 00:07:58.499 [2024-11-27 06:17:27.893388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458226878996 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:27.893414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.499 [2024-11-27 06:17:27.893461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:27.893476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.499 [2024-11-27 06:17:27.893523] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:27.893539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.499 [2024-11-27 06:17:27.893587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:27.893605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.499 #54 NEW cov: 11897 ft: 14387 corp: 22/1533b lim: 100 exec/s: 54 rss: 69Mb L: 98/99 MS: 1 InsertRepeatedBytes- 00:07:58.499 [2024-11-27 06:17:27.933356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 
nsid:0 lba:1446803458472809492 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:27.933382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.499 [2024-11-27 06:17:27.933433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:27.933449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.499 [2024-11-27 06:17:27.933498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:27.933512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.499 #55 NEW cov: 11897 ft: 14399 corp: 23/1595b lim: 100 exec/s: 55 rss: 69Mb L: 62/99 MS: 1 PersAutoDict- DE: "\017\000\000\000\000\000\000\000"- 00:07:58.499 [2024-11-27 06:17:27.973301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2095321804814160916 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:27.973327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.499 [2024-11-27 06:17:27.973376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456769922068 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:27.973395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.499 #56 NEW cov: 11897 ft: 14411 corp: 24/1649b lim: 100 exec/s: 56 rss: 69Mb L: 54/99 MS: 1 ShuffleBytes- 00:07:58.499 [2024-11-27 06:17:28.013706] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458226878996 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:28.013732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.499 [2024-11-27 06:17:28.013778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:28.013793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.499 [2024-11-27 06:17:28.013842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:28.013856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.499 [2024-11-27 06:17:28.013906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.499 [2024-11-27 06:17:28.013921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.759 #57 NEW cov: 11897 
ft: 14413 corp: 25/1745b lim: 100 exec/s: 57 rss: 69Mb L: 96/99 MS: 1 EraseBytes- 00:07:58.759 [2024-11-27 06:17:28.053824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.053851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.759 [2024-11-27 06:17:28.053899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.053913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.759 [2024-11-27 06:17:28.053963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.053978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.759 [2024-11-27 06:17:28.054026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.054040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.759 #58 NEW cov: 11897 ft: 14418 corp: 26/1839b lim: 100 exec/s: 58 rss: 70Mb L: 94/99 MS: 1 ChangeByte- 00:07:58.759 [2024-11-27 06:17:28.093654] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458226878996 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.093681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.759 [2024-11-27 06:17:28.093733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.093748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.759 #59 NEW cov: 11897 ft: 14494 corp: 27/1893b lim: 100 exec/s: 59 rss: 70Mb L: 54/99 MS: 1 EraseBytes- 00:07:58.759 [2024-11-27 06:17:28.133927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2095321804814160916 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.133954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.759 [2024-11-27 06:17:28.133992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1445374005754467348 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.134007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.759 [2024-11-27 06:17:28.134057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.134071] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.759 #60 NEW cov: 11897 ft: 14503 corp: 28/1955b lim: 100 exec/s: 60 rss: 70Mb L: 62/99 MS: 1 PersAutoDict- DE: "\017\000\000\000\000\000\000\000"- 00:07:58.759 [2024-11-27 06:17:28.173885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458472809492 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.173911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.759 [2024-11-27 06:17:28.173963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.173979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.759 #61 NEW cov: 11897 ft: 14548 corp: 29/2005b lim: 100 exec/s: 61 rss: 70Mb L: 50/99 MS: 1 EraseBytes- 00:07:58.759 [2024-11-27 06:17:28.213984] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458226878996 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.214010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.759 [2024-11-27 06:17:28.214071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.214087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.759 #62 NEW cov: 11897 ft: 14561 corp: 30/2053b lim: 100 exec/s: 62 rss: 70Mb L: 48/99 MS: 1 InsertByte- 00:07:58.759 [2024-11-27 06:17:28.254374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.254401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.759 [2024-11-27 06:17:28.254447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.254461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.759 [2024-11-27 06:17:28.254511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.254525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.759 [2024-11-27 06:17:28.254575] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.759 [2024-11-27 06:17:28.254589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.759 #63 NEW cov: 11897 ft: 
14576 corp: 31/2147b lim: 100 exec/s: 63 rss: 70Mb L: 94/99 MS: 1 CopyPart- 00:07:59.019 [2024-11-27 06:17:28.294349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.294376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.294413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1241513984 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.294429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.294479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.294494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.019 #64 NEW cov: 11897 ft: 14638 corp: 32/2221b lim: 100 exec/s: 64 rss: 70Mb L: 74/99 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:59.019 [2024-11-27 06:17:28.334287] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458226878996 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.334312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.334352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.334367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.019 #65 NEW cov: 11897 ft: 14646 corp: 33/2277b lim: 100 exec/s: 65 rss: 70Mb L: 56/99 MS: 1 ChangeBit- 00:07:59.019 [2024-11-27 06:17:28.364817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.364843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.364895] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.364911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.364961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.364975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.365024] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.365039] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.365092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:14540374740453411072 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.365106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:59.019 #66 NEW cov: 11897 ft: 14710 corp: 34/2377b lim: 100 exec/s: 66 rss: 70Mb L: 100/100 MS: 1 InsertByte- 00:07:59.019 [2024-11-27 06:17:28.404619] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458472809492 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.404647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.404684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9877541961169734025 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.404699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.404749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1446803456761533460 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.404763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.019 #67 NEW cov: 11897 ft: 14726 corp: 35/2439b lim: 100 exec/s: 67 rss: 70Mb L: 62/100 MS: 1 CrossOver- 00:07:59.019 [2024-11-27 06:17:28.444493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458226878996 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.444520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.019 #68 NEW cov: 11897 ft: 15554 corp: 36/2467b lim: 100 exec/s: 68 rss: 70Mb L: 28/100 MS: 1 EraseBytes- 00:07:59.019 [2024-11-27 06:17:28.495076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458226878996 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.495103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.495155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803370862187540 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.495172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.495234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.495249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.495301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 
nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.495315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.019 #69 NEW cov: 11897 ft: 15557 corp: 37/2561b lim: 100 exec/s: 69 rss: 70Mb L: 94/100 MS: 1 ShuffleBytes- 00:07:59.019 [2024-11-27 06:17:28.535010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458472809492 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.535035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.535072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.535087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.019 [2024-11-27 06:17:28.535139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1446803456761533460 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.019 [2024-11-27 06:17:28.535154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.280 #70 NEW cov: 11897 ft: 15632 corp: 38/2623b lim: 100 exec/s: 70 rss: 70Mb L: 62/100 MS: 1 CopyPart- 00:07:59.280 [2024-11-27 06:17:28.574993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1446803458472809492 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.575019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.575056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.575071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.280 #71 NEW cov: 11897 ft: 15650 corp: 39/2674b lim: 100 exec/s: 71 rss: 70Mb L: 51/100 MS: 1 InsertByte- 00:07:59.280 [2024-11-27 06:17:28.615380] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9910603676685797769 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.615406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.615452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.615467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.615517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.615532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.615581] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.615595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.280 #72 NEW cov: 11897 ft: 15655 corp: 40/2754b lim: 100 exec/s: 72 rss: 70Mb L: 80/100 MS: 1 ChangeByte- 00:07:59.280 [2024-11-27 06:17:28.655536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.655563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.655615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.655630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.655682] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.655696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.655748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.655763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.280 #73 NEW cov: 11897 ft: 15658 corp: 41/2836b lim: 100 exec/s: 73 rss: 70Mb L: 82/100 MS: 1 PersAutoDict- DE: "\017\000\000\000\000\000\000\000"- 00:07:59.280 [2024-11-27 06:17:28.695679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.695706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.695742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.695756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.695808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18944 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.695823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.695872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.695887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.280 #74 NEW cov: 11897 ft: 
15672 corp: 42/2932b lim: 100 exec/s: 74 rss: 70Mb L: 96/100 MS: 1 CopyPart- 00:07:59.280 [2024-11-27 06:17:28.735869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.735895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.735966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.735982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.736033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.736046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.736097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16492674416640 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.736112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.280 [2024-11-27 06:17:28.736164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.280 [2024-11-27 06:17:28.736180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:59.280 #75 NEW cov: 11897 ft: 15735 corp: 43/3032b lim: 100 exec/s: 37 rss: 70Mb L: 100/100 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:59.280 #75 DONE cov: 11897 ft: 15735 corp: 43/3032b lim: 100 exec/s: 37 rss: 70Mb 00:07:59.280 ###### Recommended dictionary. ###### 00:07:59.280 "\001\000\000\000" # Uses: 2 00:07:59.280 "\017\000\000\000\000\000\000\000" # Uses: 3 00:07:59.280 "\000\000\000\000\000\000\000\000" # Uses: 0 00:07:59.280 ###### End of recommended dictionary. 
###### 00:07:59.280 Done 75 runs in 2 second(s) 00:07:59.540 06:17:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:07:59.540 06:17:28 -- ../common.sh@72 -- # (( i++ )) 00:07:59.540 06:17:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.540 06:17:28 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:07:59.540 00:07:59.540 real 1m3.767s 00:07:59.540 user 1m40.778s 00:07:59.540 sys 0m6.738s 00:07:59.540 06:17:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:59.540 06:17:28 -- common/autotest_common.sh@10 -- # set +x 00:07:59.540 ************************************ 00:07:59.540 END TEST nvmf_fuzz 00:07:59.540 ************************************ 00:07:59.540 06:17:28 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:59.540 06:17:28 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:59.540 06:17:28 -- fuzz/llvm.sh@20 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:07:59.540 06:17:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:59.540 06:17:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:59.540 06:17:28 -- common/autotest_common.sh@10 -- # set +x 00:07:59.540 ************************************ 00:07:59.540 START TEST vfio_fuzz 00:07:59.540 ************************************ 00:07:59.540 06:17:28 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:07:59.540 * Looking for test storage... 00:07:59.540 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:59.540 06:17:29 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:59.540 06:17:29 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:59.540 06:17:29 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:59.803 06:17:29 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:59.803 06:17:29 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:59.803 06:17:29 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:59.803 06:17:29 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:59.803 06:17:29 -- scripts/common.sh@335 -- # IFS=.-: 00:07:59.803 06:17:29 -- scripts/common.sh@335 -- # read -ra ver1 00:07:59.803 06:17:29 -- scripts/common.sh@336 -- # IFS=.-: 00:07:59.803 06:17:29 -- scripts/common.sh@336 -- # read -ra ver2 00:07:59.803 06:17:29 -- scripts/common.sh@337 -- # local 'op=<' 00:07:59.803 06:17:29 -- scripts/common.sh@339 -- # ver1_l=2 00:07:59.803 06:17:29 -- scripts/common.sh@340 -- # ver2_l=1 00:07:59.803 06:17:29 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:59.803 06:17:29 -- scripts/common.sh@343 -- # case "$op" in 00:07:59.803 06:17:29 -- scripts/common.sh@344 -- # : 1 00:07:59.803 06:17:29 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:59.803 06:17:29 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:59.803 06:17:29 -- scripts/common.sh@364 -- # decimal 1 00:07:59.803 06:17:29 -- scripts/common.sh@352 -- # local d=1 00:07:59.803 06:17:29 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:59.803 06:17:29 -- scripts/common.sh@354 -- # echo 1 00:07:59.803 06:17:29 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:59.803 06:17:29 -- scripts/common.sh@365 -- # decimal 2 00:07:59.803 06:17:29 -- scripts/common.sh@352 -- # local d=2 00:07:59.803 06:17:29 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:59.803 06:17:29 -- scripts/common.sh@354 -- # echo 2 00:07:59.803 06:17:29 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:59.803 06:17:29 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:59.803 06:17:29 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:59.803 06:17:29 -- scripts/common.sh@367 -- # return 0 00:07:59.803 06:17:29 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:59.803 06:17:29 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:59.803 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.803 --rc genhtml_branch_coverage=1 00:07:59.803 --rc genhtml_function_coverage=1 00:07:59.803 --rc genhtml_legend=1 00:07:59.803 --rc geninfo_all_blocks=1 00:07:59.803 --rc geninfo_unexecuted_blocks=1 00:07:59.803 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.803 ' 00:07:59.803 06:17:29 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:59.803 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.803 --rc genhtml_branch_coverage=1 00:07:59.803 --rc genhtml_function_coverage=1 00:07:59.803 --rc genhtml_legend=1 00:07:59.803 --rc geninfo_all_blocks=1 00:07:59.803 --rc geninfo_unexecuted_blocks=1 00:07:59.803 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.803 ' 00:07:59.803 06:17:29 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:59.803 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.803 --rc genhtml_branch_coverage=1 00:07:59.803 --rc genhtml_function_coverage=1 00:07:59.803 --rc genhtml_legend=1 00:07:59.803 --rc geninfo_all_blocks=1 00:07:59.803 --rc geninfo_unexecuted_blocks=1 00:07:59.803 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.803 ' 00:07:59.803 06:17:29 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:59.803 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.803 --rc genhtml_branch_coverage=1 00:07:59.803 --rc genhtml_function_coverage=1 00:07:59.803 --rc genhtml_legend=1 00:07:59.803 --rc geninfo_all_blocks=1 00:07:59.803 --rc geninfo_unexecuted_blocks=1 00:07:59.803 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.803 ' 00:07:59.803 06:17:29 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:59.803 06:17:29 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:59.803 06:17:29 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:59.803 06:17:29 -- common/autotest_common.sh@34 -- # set -e 00:07:59.803 06:17:29 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:59.803 06:17:29 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:59.803 06:17:29 -- common/autotest_common.sh@38 -- # [[ -e 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:59.803 06:17:29 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:59.803 06:17:29 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:59.803 06:17:29 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:59.803 06:17:29 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:59.803 06:17:29 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:59.803 06:17:29 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:59.803 06:17:29 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:59.803 06:17:29 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:59.803 06:17:29 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:59.803 06:17:29 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:59.803 06:17:29 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:59.803 06:17:29 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:59.803 06:17:29 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:59.803 06:17:29 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:59.803 06:17:29 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:59.803 06:17:29 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:59.803 06:17:29 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:59.803 06:17:29 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:59.803 06:17:29 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:59.804 06:17:29 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:59.804 06:17:29 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:59.804 06:17:29 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:59.804 06:17:29 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:59.804 06:17:29 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:59.804 06:17:29 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:59.804 06:17:29 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:59.804 06:17:29 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:59.804 06:17:29 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:59.804 06:17:29 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:59.804 06:17:29 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:59.804 06:17:29 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:59.804 06:17:29 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:59.804 06:17:29 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:59.804 06:17:29 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:59.804 06:17:29 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:59.804 06:17:29 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:59.804 06:17:29 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:59.804 06:17:29 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:59.804 06:17:29 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:59.804 06:17:29 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:59.804 06:17:29 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:59.804 06:17:29 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:07:59.804 06:17:29 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 
00:07:59.804 06:17:29 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:59.804 06:17:29 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:59.804 06:17:29 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:59.804 06:17:29 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:59.804 06:17:29 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:59.804 06:17:29 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:59.804 06:17:29 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:59.804 06:17:29 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:59.804 06:17:29 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:59.804 06:17:29 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:59.804 06:17:29 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:59.804 06:17:29 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:59.804 06:17:29 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:59.804 06:17:29 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:59.804 06:17:29 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:59.804 06:17:29 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:59.804 06:17:29 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:59.804 06:17:29 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:59.804 06:17:29 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:07:59.804 06:17:29 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:59.804 06:17:29 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:59.804 06:17:29 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:59.804 06:17:29 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:59.804 06:17:29 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:59.804 06:17:29 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:59.804 06:17:29 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:59.804 06:17:29 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:59.804 06:17:29 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:59.804 06:17:29 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:59.804 06:17:29 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:59.804 06:17:29 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:59.804 06:17:29 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:59.804 06:17:29 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:59.804 06:17:29 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:59.804 06:17:29 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:59.804 06:17:29 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:59.804 06:17:29 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:59.804 06:17:29 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:59.804 06:17:29 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:59.804 06:17:29 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:59.804 06:17:29 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:59.804 06:17:29 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:59.804 06:17:29 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 
00:07:59.804 06:17:29 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:59.804 06:17:29 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:59.804 06:17:29 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:59.804 06:17:29 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:59.804 06:17:29 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:59.804 06:17:29 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:59.804 06:17:29 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:59.804 06:17:29 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:59.804 06:17:29 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:59.804 06:17:29 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:59.804 #define SPDK_CONFIG_H 00:07:59.804 #define SPDK_CONFIG_APPS 1 00:07:59.804 #define SPDK_CONFIG_ARCH native 00:07:59.804 #undef SPDK_CONFIG_ASAN 00:07:59.804 #undef SPDK_CONFIG_AVAHI 00:07:59.804 #undef SPDK_CONFIG_CET 00:07:59.804 #define SPDK_CONFIG_COVERAGE 1 00:07:59.804 #define SPDK_CONFIG_CROSS_PREFIX 00:07:59.804 #undef SPDK_CONFIG_CRYPTO 00:07:59.804 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:59.804 #undef SPDK_CONFIG_CUSTOMOCF 00:07:59.804 #undef SPDK_CONFIG_DAOS 00:07:59.804 #define SPDK_CONFIG_DAOS_DIR 00:07:59.804 #define SPDK_CONFIG_DEBUG 1 00:07:59.804 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:59.804 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:59.804 #define SPDK_CONFIG_DPDK_INC_DIR 00:07:59.804 #define SPDK_CONFIG_DPDK_LIB_DIR 00:07:59.804 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:59.804 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:59.804 #define SPDK_CONFIG_EXAMPLES 1 00:07:59.804 #undef SPDK_CONFIG_FC 00:07:59.804 #define SPDK_CONFIG_FC_PATH 00:07:59.804 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:59.804 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:59.804 #undef SPDK_CONFIG_FUSE 00:07:59.804 #define SPDK_CONFIG_FUZZER 1 00:07:59.804 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:59.804 #undef SPDK_CONFIG_GOLANG 00:07:59.804 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:59.804 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:59.804 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:59.804 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:59.804 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:59.804 #define SPDK_CONFIG_IDXD 1 00:07:59.804 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:59.804 #undef SPDK_CONFIG_IPSEC_MB 00:07:59.804 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:59.804 #define SPDK_CONFIG_ISAL 1 00:07:59.804 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:59.804 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:59.804 #define SPDK_CONFIG_LIBDIR 00:07:59.804 #undef SPDK_CONFIG_LTO 00:07:59.804 #define SPDK_CONFIG_MAX_LCORES 00:07:59.804 #define SPDK_CONFIG_NVME_CUSE 1 00:07:59.804 #undef SPDK_CONFIG_OCF 00:07:59.804 #define SPDK_CONFIG_OCF_PATH 00:07:59.804 #define SPDK_CONFIG_OPENSSL_PATH 00:07:59.804 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:59.804 #undef SPDK_CONFIG_PGO_USE 00:07:59.804 #define SPDK_CONFIG_PREFIX /usr/local 00:07:59.804 #undef SPDK_CONFIG_RAID5F 00:07:59.804 #undef SPDK_CONFIG_RBD 00:07:59.804 #define 
SPDK_CONFIG_RDMA 1 00:07:59.804 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:59.804 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:59.804 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:59.804 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:59.804 #undef SPDK_CONFIG_SHARED 00:07:59.804 #undef SPDK_CONFIG_SMA 00:07:59.804 #define SPDK_CONFIG_TESTS 1 00:07:59.804 #undef SPDK_CONFIG_TSAN 00:07:59.804 #define SPDK_CONFIG_UBLK 1 00:07:59.804 #define SPDK_CONFIG_UBSAN 1 00:07:59.804 #undef SPDK_CONFIG_UNIT_TESTS 00:07:59.804 #undef SPDK_CONFIG_URING 00:07:59.804 #define SPDK_CONFIG_URING_PATH 00:07:59.804 #undef SPDK_CONFIG_URING_ZNS 00:07:59.804 #undef SPDK_CONFIG_USDT 00:07:59.804 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:59.804 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:59.804 #define SPDK_CONFIG_VFIO_USER 1 00:07:59.804 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:59.804 #define SPDK_CONFIG_VHOST 1 00:07:59.804 #define SPDK_CONFIG_VIRTIO 1 00:07:59.804 #undef SPDK_CONFIG_VTUNE 00:07:59.804 #define SPDK_CONFIG_VTUNE_DIR 00:07:59.804 #define SPDK_CONFIG_WERROR 1 00:07:59.804 #define SPDK_CONFIG_WPDK_DIR 00:07:59.804 #undef SPDK_CONFIG_XNVME 00:07:59.804 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:59.804 06:17:29 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:59.804 06:17:29 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:59.804 06:17:29 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:59.804 06:17:29 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:59.804 06:17:29 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:59.804 06:17:29 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:59.805 06:17:29 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:59.805 06:17:29 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:59.805 06:17:29 -- paths/export.sh@5 -- # export PATH 00:07:59.805 06:17:29 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:59.805 06:17:29 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:59.805 06:17:29 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:59.805 06:17:29 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:59.805 06:17:29 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:59.805 06:17:29 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:59.805 06:17:29 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:59.805 06:17:29 -- pm/common@16 -- # TEST_TAG=N/A 00:07:59.805 06:17:29 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:59.805 06:17:29 -- common/autotest_common.sh@52 -- # : 1 00:07:59.805 06:17:29 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:59.805 06:17:29 -- common/autotest_common.sh@56 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:59.805 06:17:29 -- common/autotest_common.sh@58 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:59.805 06:17:29 -- common/autotest_common.sh@60 -- # : 1 00:07:59.805 06:17:29 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:59.805 06:17:29 -- common/autotest_common.sh@62 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:59.805 06:17:29 -- common/autotest_common.sh@64 -- # : 00:07:59.805 06:17:29 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:59.805 06:17:29 -- common/autotest_common.sh@66 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:59.805 06:17:29 -- common/autotest_common.sh@68 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:59.805 06:17:29 -- common/autotest_common.sh@70 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:59.805 06:17:29 -- common/autotest_common.sh@72 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:59.805 06:17:29 -- common/autotest_common.sh@74 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:59.805 06:17:29 -- common/autotest_common.sh@76 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:59.805 06:17:29 -- common/autotest_common.sh@78 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:59.805 06:17:29 -- common/autotest_common.sh@80 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:59.805 06:17:29 -- common/autotest_common.sh@82 -- # : 0 00:07:59.805 06:17:29 
-- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:59.805 06:17:29 -- common/autotest_common.sh@84 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:59.805 06:17:29 -- common/autotest_common.sh@86 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:59.805 06:17:29 -- common/autotest_common.sh@88 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:59.805 06:17:29 -- common/autotest_common.sh@90 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:59.805 06:17:29 -- common/autotest_common.sh@92 -- # : 1 00:07:59.805 06:17:29 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:59.805 06:17:29 -- common/autotest_common.sh@94 -- # : 1 00:07:59.805 06:17:29 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:59.805 06:17:29 -- common/autotest_common.sh@96 -- # : rdma 00:07:59.805 06:17:29 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:59.805 06:17:29 -- common/autotest_common.sh@98 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:59.805 06:17:29 -- common/autotest_common.sh@100 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:59.805 06:17:29 -- common/autotest_common.sh@102 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:59.805 06:17:29 -- common/autotest_common.sh@104 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:59.805 06:17:29 -- common/autotest_common.sh@106 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:59.805 06:17:29 -- common/autotest_common.sh@108 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:07:59.805 06:17:29 -- common/autotest_common.sh@110 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:59.805 06:17:29 -- common/autotest_common.sh@112 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:59.805 06:17:29 -- common/autotest_common.sh@114 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:59.805 06:17:29 -- common/autotest_common.sh@116 -- # : 1 00:07:59.805 06:17:29 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:59.805 06:17:29 -- common/autotest_common.sh@118 -- # : 00:07:59.805 06:17:29 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:59.805 06:17:29 -- common/autotest_common.sh@120 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:59.805 06:17:29 -- common/autotest_common.sh@122 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:59.805 06:17:29 -- common/autotest_common.sh@124 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:59.805 06:17:29 -- common/autotest_common.sh@126 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:59.805 06:17:29 -- common/autotest_common.sh@128 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:59.805 06:17:29 -- common/autotest_common.sh@130 -- # : 0 00:07:59.805 
06:17:29 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:59.805 06:17:29 -- common/autotest_common.sh@132 -- # : 00:07:59.805 06:17:29 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:59.805 06:17:29 -- common/autotest_common.sh@134 -- # : true 00:07:59.805 06:17:29 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:59.805 06:17:29 -- common/autotest_common.sh@136 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:59.805 06:17:29 -- common/autotest_common.sh@138 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:59.805 06:17:29 -- common/autotest_common.sh@140 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:59.805 06:17:29 -- common/autotest_common.sh@142 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:59.805 06:17:29 -- common/autotest_common.sh@144 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:59.805 06:17:29 -- common/autotest_common.sh@146 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:59.805 06:17:29 -- common/autotest_common.sh@148 -- # : 00:07:59.805 06:17:29 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:59.805 06:17:29 -- common/autotest_common.sh@150 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:59.805 06:17:29 -- common/autotest_common.sh@152 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:59.805 06:17:29 -- common/autotest_common.sh@154 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:59.805 06:17:29 -- common/autotest_common.sh@156 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:59.805 06:17:29 -- common/autotest_common.sh@158 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:59.805 06:17:29 -- common/autotest_common.sh@160 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:59.805 06:17:29 -- common/autotest_common.sh@163 -- # : 00:07:59.805 06:17:29 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:59.805 06:17:29 -- common/autotest_common.sh@165 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:59.805 06:17:29 -- common/autotest_common.sh@167 -- # : 0 00:07:59.805 06:17:29 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:59.805 06:17:29 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:59.805 06:17:29 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:59.805 06:17:29 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:59.805 06:17:29 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:59.805 06:17:29 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:59.805 06:17:29 -- common/autotest_common.sh@173 -- # 
VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:59.806 06:17:29 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:59.806 06:17:29 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:59.806 06:17:29 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:59.806 06:17:29 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:59.806 06:17:29 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:59.806 06:17:29 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:59.806 06:17:29 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:59.806 06:17:29 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:59.806 06:17:29 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:59.806 06:17:29 
-- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:59.806 06:17:29 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:59.806 06:17:29 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:59.806 06:17:29 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:59.806 06:17:29 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:59.806 06:17:29 -- common/autotest_common.sh@196 -- # cat 00:07:59.806 06:17:29 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:59.806 06:17:29 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:59.806 06:17:29 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:59.806 06:17:29 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:59.806 06:17:29 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:59.806 06:17:29 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:59.806 06:17:29 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:59.806 06:17:29 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:59.806 06:17:29 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:59.806 06:17:29 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:59.806 06:17:29 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:59.806 06:17:29 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:59.806 06:17:29 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:59.806 06:17:29 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:59.806 06:17:29 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:59.806 06:17:29 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:59.806 06:17:29 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:59.806 06:17:29 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:59.806 06:17:29 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:59.806 06:17:29 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:07:59.806 06:17:29 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:07:59.806 06:17:29 -- common/autotest_common.sh@249 -- # _LCOV= 00:07:59.806 06:17:29 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:07:59.806 06:17:29 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:07:59.806 06:17:29 -- common/autotest_common.sh@250 -- # _LCOV=1 00:07:59.806 06:17:29 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:59.806 06:17:29 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:07:59.806 06:17:29 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:59.806 06:17:29 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:07:59.806 06:17:29 -- common/autotest_common.sh@259 -- # export valgrind= 00:07:59.806 06:17:29 -- common/autotest_common.sh@259 -- # valgrind= 00:07:59.806 06:17:29 -- common/autotest_common.sh@265 -- # uname -s 00:07:59.806 06:17:29 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:07:59.806 06:17:29 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:07:59.806 06:17:29 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:07:59.806 06:17:29 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:07:59.806 06:17:29 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:59.806 06:17:29 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:59.806 06:17:29 -- common/autotest_common.sh@275 -- # MAKE=make 00:07:59.806 06:17:29 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:07:59.806 06:17:29 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:07:59.806 06:17:29 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:07:59.806 06:17:29 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:59.806 06:17:29 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:07:59.806 06:17:29 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:07:59.806 06:17:29 -- common/autotest_common.sh@319 -- # [[ -z 40187 ]] 00:07:59.806 06:17:29 -- common/autotest_common.sh@319 -- # kill -0 40187 00:07:59.806 06:17:29 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:07:59.806 06:17:29 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:07:59.806 06:17:29 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:07:59.806 06:17:29 -- common/autotest_common.sh@332 -- # local mount target_dir 00:07:59.806 06:17:29 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:07:59.806 06:17:29 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:07:59.806 06:17:29 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:07:59.806 06:17:29 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:07:59.806 06:17:29 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.5rXS5j 00:07:59.806 06:17:29 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:59.806 06:17:29 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:07:59.806 06:17:29 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:07:59.806 06:17:29 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.5rXS5j/tests/vfio /tmp/spdk.5rXS5j 00:07:59.806 06:17:29 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:07:59.806 06:17:29 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:59.806 06:17:29 -- common/autotest_common.sh@328 -- # df -T 00:07:59.806 06:17:29 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:07:59.806 06:17:29 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:07:59.806 06:17:29 -- 
common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:07:59.806 06:17:29 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:07:59.806 06:17:29 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:07:59.806 06:17:29 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:07:59.806 06:17:29 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:59.806 06:17:29 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:07:59.806 06:17:29 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:07:59.806 06:17:29 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:07:59.806 06:17:29 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:07:59.806 06:17:29 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:07:59.806 06:17:29 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:59.806 06:17:29 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:07:59.806 06:17:29 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:07:59.806 06:17:29 -- common/autotest_common.sh@363 -- # avails["$mount"]=53299867648 00:07:59.806 06:17:29 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:07:59.806 06:17:29 -- common/autotest_common.sh@364 -- # uses["$mount"]=8430739456 00:07:59.806 06:17:29 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:59.806 06:17:29 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:59.806 06:17:29 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:59.806 06:17:29 -- common/autotest_common.sh@363 -- # avails["$mount"]=30862708736 00:07:59.806 06:17:29 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:07:59.806 06:17:29 -- common/autotest_common.sh@364 -- # uses["$mount"]=2592768 00:07:59.806 06:17:29 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:59.806 06:17:29 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:59.806 06:17:29 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:59.806 06:17:29 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340129792 00:07:59.806 06:17:29 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:07:59.806 06:17:29 -- common/autotest_common.sh@364 -- # uses["$mount"]=5992448 00:07:59.806 06:17:29 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:59.806 06:17:29 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:59.807 06:17:29 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:59.807 06:17:29 -- common/autotest_common.sh@363 -- # avails["$mount"]=30863708160 00:07:59.807 06:17:29 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:07:59.807 06:17:29 -- common/autotest_common.sh@364 -- # uses["$mount"]=1597440 00:07:59.807 06:17:29 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:59.807 06:17:29 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:59.807 06:17:29 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:59.807 06:17:29 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:07:59.807 06:17:29 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:07:59.807 06:17:29 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:07:59.807 06:17:29 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 
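Further up, the long alternation of ': 0' (or ': 1') records with export statements, autotest_common.sh@52 through @168, reads as the shell default-assignment idiom: ':' is a no-op command whose argument still performs the ${VAR:=default} expansion. A sketch of that interpretation, inferred from the trace rather than quoted from the script:

    #!/usr/bin/env bash
    # Assign a default only when the variable is unset or empty, then
    # export it so the fuzzer processes inherit the same knobs.
    : "${SPDK_TEST_FUZZER:=0}"
    export SPDK_TEST_FUZZER

    : "${SPDK_RUN_UBSAN:=0}"
    export SPDK_RUN_UBSAN

    # If the CI config already set a flag to 1, ':=' leaves it alone,
    # which is why some of the traced no-ops show ': 1' instead of ': 0'.
    echo "SPDK_TEST_FUZZER=$SPDK_TEST_FUZZER SPDK_RUN_UBSAN=$SPDK_RUN_UBSAN"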
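The records immediately above show set_test_storage parsing `df -T` output into associative arrays keyed by mount point. A runnable sketch of that loop: the field order is exactly the traced read, while the multiplication into bytes is an assumption that matches the sizes the log prints (for example 5284429824 for /dev/pmem0):

    #!/usr/bin/env bash
    declare -A mounts fss sizes avails uses

    # Field order follows the traced read: source fs size use avail _ mount
    # (df -T prints: Filesystem Type 1K-blocks Used Available Use% Mounted on).
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$((size * 1024))     # assumed 1K-block to byte
        avails["$mount"]=$((avail * 1024))   # conversion; the logged
        uses["$mount"]=$((use * 1024))       # values are byte-sized
    done < <(df -T | grep -v Filesystem)

    echo "/ is ${fss[/]:-unknown} with ${avails[/]:-0} bytes available"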
00:07:59.807 06:17:29 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n'
00:07:59.807 * Looking for test storage...
00:07:59.807 06:17:29 -- common/autotest_common.sh@369 -- # local target_space new_size
00:07:59.807 06:17:29 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}"
00:07:59.807 06:17:29 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:07:59.807 06:17:29 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}'
00:07:59.807 06:17:29 -- common/autotest_common.sh@373 -- # mount=/
00:07:59.807 06:17:29 -- common/autotest_common.sh@375 -- # target_space=53299867648
00:07:59.807 06:17:29 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size ))
00:07:59.807 06:17:29 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size ))
00:07:59.807 06:17:29 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]]
00:07:59.807 06:17:29 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]]
00:07:59.807 06:17:29 -- common/autotest_common.sh@381 -- # [[ / == / ]]
00:07:59.807 06:17:29 -- common/autotest_common.sh@382 -- # new_size=10645331968
00:07:59.807 06:17:29 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 ))
00:07:59.807 06:17:29 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:07:59.807 06:17:29 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:07:59.807 06:17:29 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:07:59.807 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:07:59.807 06:17:29 -- common/autotest_common.sh@390 -- # return 0
00:07:59.807 06:17:29 -- common/autotest_common.sh@1677 -- # set -o errtrace
00:07:59.807 06:17:29 -- common/autotest_common.sh@1678 -- # shopt -s extdebug
00:07:59.807 06:17:29 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR
00:07:59.807 06:17:29 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ '
00:07:59.807 06:17:29 -- common/autotest_common.sh@1682 -- # true
00:07:59.807 06:17:29 -- common/autotest_common.sh@1684 -- # xtrace_fd
00:07:59.807 06:17:29 -- common/autotest_common.sh@25 -- # [[ -n 14 ]]
00:07:59.807 06:17:29 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]]
00:07:59.807 06:17:29 -- common/autotest_common.sh@27 -- # exec
00:07:59.807 06:17:29 -- common/autotest_common.sh@29 -- # exec
00:07:59.807 06:17:29 -- common/autotest_common.sh@31 -- # xtrace_restore
00:07:59.807 06:17:29 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ?
0 : 0 - 1]' 00:07:59.807 06:17:29 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:59.807 06:17:29 -- common/autotest_common.sh@18 -- # set -x 00:07:59.807 06:17:29 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:59.807 06:17:29 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:59.807 06:17:29 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:00.067 06:17:29 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:00.067 06:17:29 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:00.067 06:17:29 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:00.067 06:17:29 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:00.067 06:17:29 -- scripts/common.sh@335 -- # IFS=.-: 00:08:00.067 06:17:29 -- scripts/common.sh@335 -- # read -ra ver1 00:08:00.067 06:17:29 -- scripts/common.sh@336 -- # IFS=.-: 00:08:00.067 06:17:29 -- scripts/common.sh@336 -- # read -ra ver2 00:08:00.067 06:17:29 -- scripts/common.sh@337 -- # local 'op=<' 00:08:00.067 06:17:29 -- scripts/common.sh@339 -- # ver1_l=2 00:08:00.067 06:17:29 -- scripts/common.sh@340 -- # ver2_l=1 00:08:00.067 06:17:29 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:00.067 06:17:29 -- scripts/common.sh@343 -- # case "$op" in 00:08:00.067 06:17:29 -- scripts/common.sh@344 -- # : 1 00:08:00.067 06:17:29 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:00.067 06:17:29 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:00.067 06:17:29 -- scripts/common.sh@364 -- # decimal 1 00:08:00.067 06:17:29 -- scripts/common.sh@352 -- # local d=1 00:08:00.067 06:17:29 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:00.067 06:17:29 -- scripts/common.sh@354 -- # echo 1 00:08:00.067 06:17:29 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:00.067 06:17:29 -- scripts/common.sh@365 -- # decimal 2 00:08:00.067 06:17:29 -- scripts/common.sh@352 -- # local d=2 00:08:00.067 06:17:29 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:00.067 06:17:29 -- scripts/common.sh@354 -- # echo 2 00:08:00.067 06:17:29 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:00.067 06:17:29 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:00.067 06:17:29 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:00.067 06:17:29 -- scripts/common.sh@367 -- # return 0 00:08:00.067 06:17:29 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:00.067 06:17:29 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:00.067 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.067 --rc genhtml_branch_coverage=1 00:08:00.067 --rc genhtml_function_coverage=1 00:08:00.067 --rc genhtml_legend=1 00:08:00.067 --rc geninfo_all_blocks=1 00:08:00.067 --rc geninfo_unexecuted_blocks=1 00:08:00.067 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.067 ' 00:08:00.067 06:17:29 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:00.067 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.067 --rc genhtml_branch_coverage=1 00:08:00.067 --rc genhtml_function_coverage=1 00:08:00.067 --rc genhtml_legend=1 00:08:00.067 --rc geninfo_all_blocks=1 00:08:00.067 --rc geninfo_unexecuted_blocks=1 00:08:00.067 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.067 ' 00:08:00.067 06:17:29 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:00.067 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
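The storage search above comes down to integer arithmetic on those arrays: the candidate mount's available space (target_space=53299867648) must cover the request, and the projected usage must stay below 95% of the filesystem. Re-deriving new_size from the values in the trace shows it is the mount's used space plus the request:

    #!/usr/bin/env bash
    requested_size=2214592512   # from the trace: 2 GiB plus a 64 MiB margin
    used=8430739456             # uses["/"], printed earlier in the log
    size=61730607104            # sizes["/"]

    new_size=$((used + requested_size))
    echo "$new_size"            # 10645331968, matching autotest_common.sh@382

    # The guard at @383 only rejects the mount when projected usage
    # exceeds 95% of the filesystem; here it is roughly 17%.
    if (( new_size * 100 / size > 95 )); then
        echo "would skip this mount and try the next candidate"
    fi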
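The cmp_versions records at the end of the trace above (checking `lt 1.15 2` for the installed lcov) implement component-wise numeric comparison: split both versions on dots, then compare index by index, treating missing components as zero. A compact re-implementation of that idea, not the scripts/common.sh original:

    #!/usr/bin/env bash
    # Returns 0 (true) when $1 is strictly less than $2.
    version_lt() {
        local IFS=.
        local -a a=($1) b=($2)
        local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
        for (( i = 0; i < n; i++ )); do
            (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
            (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
        done
        return 1   # equal versions are not less-than
    }

    version_lt 1.15 2 && echo "1.15 < 2"       # matches the traced result
    version_lt 2.1 2.0 || echo "2.1 >= 2.0"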
00:08:00.067 --rc genhtml_branch_coverage=1 00:08:00.067 --rc genhtml_function_coverage=1 00:08:00.067 --rc genhtml_legend=1 00:08:00.067 --rc geninfo_all_blocks=1 00:08:00.067 --rc geninfo_unexecuted_blocks=1 00:08:00.067 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.067 ' 00:08:00.067 06:17:29 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:00.067 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.067 --rc genhtml_branch_coverage=1 00:08:00.067 --rc genhtml_function_coverage=1 00:08:00.067 --rc genhtml_legend=1 00:08:00.067 --rc geninfo_all_blocks=1 00:08:00.067 --rc geninfo_unexecuted_blocks=1 00:08:00.067 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.067 ' 00:08:00.067 06:17:29 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:00.067 06:17:29 -- ../common.sh@8 -- # pids=() 00:08:00.067 06:17:29 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:00.067 06:17:29 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:00.067 06:17:29 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:00.067 06:17:29 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:00.067 06:17:29 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:00.067 06:17:29 -- vfio/run.sh@65 -- # mem_size=0 00:08:00.067 06:17:29 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:00.067 06:17:29 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:00.067 06:17:29 -- ../common.sh@69 -- # local fuzz_num=7 00:08:00.067 06:17:29 -- ../common.sh@70 -- # local time=1 00:08:00.067 06:17:29 -- ../common.sh@72 -- # (( i = 0 )) 00:08:00.067 06:17:29 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.067 06:17:29 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:00.067 06:17:29 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:00.067 06:17:29 -- vfio/run.sh@23 -- # local timen=1 00:08:00.067 06:17:29 -- vfio/run.sh@24 -- # local core=0x1 00:08:00.067 06:17:29 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:00.067 06:17:29 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:00.068 06:17:29 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:00.068 06:17:29 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:00.068 06:17:29 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:00.068 06:17:29 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:00.068 06:17:29 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:00.068 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:00.068 06:17:29 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 
-Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:00.068 [2024-11-27 06:17:29.398733] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:00.068 [2024-11-27 06:17:29.398793] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid40277 ] 00:08:00.068 EAL: No free 2048 kB hugepages reported on node 1 00:08:00.068 [2024-11-27 06:17:29.467764] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.068 [2024-11-27 06:17:29.537321] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:00.068 [2024-11-27 06:17:29.537466] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.326 INFO: Running with entropic power schedule (0xFF, 100). 00:08:00.326 INFO: Seed: 1042664337 00:08:00.326 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:00.326 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:00.326 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:00.326 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.326 #2 INITED exec/s: 0 rss: 62Mb 00:08:00.326 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:00.326 This may also happen if the target rejected all inputs we tried so far 00:08:00.846 NEW_FUNC[1/630]: 0x43a218 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:00.846 NEW_FUNC[2/630]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:00.846 #7 NEW cov: 10755 ft: 10624 corp: 2/30b lim: 60 exec/s: 0 rss: 68Mb L: 29/29 MS: 5 ShuffleBytes-ShuffleBytes-CrossOver-EraseBytes-InsertRepeatedBytes- 00:08:01.108 NEW_FUNC[1/1]: 0x133a668 in q_addr /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:545 00:08:01.108 #9 NEW cov: 10779 ft: 14361 corp: 3/59b lim: 60 exec/s: 0 rss: 69Mb L: 29/29 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:01.108 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:01.108 #13 NEW cov: 10796 ft: 15680 corp: 4/79b lim: 60 exec/s: 0 rss: 70Mb L: 20/29 MS: 4 ChangeByte-ShuffleBytes-ChangeBinInt-CrossOver- 00:08:01.366 #19 NEW cov: 10796 ft: 16072 corp: 5/108b lim: 60 exec/s: 19 rss: 70Mb L: 29/29 MS: 1 ChangeBinInt- 00:08:01.625 #20 NEW cov: 10796 ft: 17163 corp: 6/137b lim: 60 exec/s: 20 rss: 70Mb L: 29/29 MS: 1 CopyPart- 00:08:01.884 #21 NEW cov: 10796 ft: 17697 corp: 7/166b lim: 60 exec/s: 21 rss: 70Mb L: 29/29 MS: 1 ShuffleBytes- 00:08:01.884 #22 NEW cov: 10796 ft: 17805 corp: 8/195b lim: 60 exec/s: 22 rss: 70Mb L: 29/29 MS: 1 ChangeByte- 00:08:02.143 #23 NEW cov: 10803 ft: 18146 corp: 9/225b lim: 60 exec/s: 23 rss: 70Mb L: 30/30 MS: 1 InsertByte- 00:08:02.402 #24 NEW cov: 10803 ft: 18256 corp: 10/254b lim: 60 exec/s: 12 rss: 70Mb L: 29/30 MS: 1 ChangeBinInt- 00:08:02.402 #24 DONE cov: 10803 ft: 18256 corp: 10/254b lim: 60 exec/s: 12 rss: 70Mb 00:08:02.402 Done 24 runs in 2 second(s) 00:08:02.662 06:17:32 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:02.662 06:17:32 -- ../common.sh@72 -- # (( i++ )) 00:08:02.662 06:17:32 -- 
../common.sh@72 -- # (( i < fuzz_num )) 00:08:02.662 06:17:32 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:02.662 06:17:32 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:02.662 06:17:32 -- vfio/run.sh@23 -- # local timen=1 00:08:02.662 06:17:32 -- vfio/run.sh@24 -- # local core=0x1 00:08:02.662 06:17:32 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:02.662 06:17:32 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:02.662 06:17:32 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:02.662 06:17:32 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:02.662 06:17:32 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:02.662 06:17:32 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:02.662 06:17:32 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:02.662 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:02.662 06:17:32 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:02.662 [2024-11-27 06:17:32.096891] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:02.662 [2024-11-27 06:17:32.096972] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid40789 ] 00:08:02.662 EAL: No free 2048 kB hugepages reported on node 1 00:08:02.662 [2024-11-27 06:17:32.168713] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.921 [2024-11-27 06:17:32.238719] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:02.921 [2024-11-27 06:17:32.238862] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.921 INFO: Running with entropic power schedule (0xFF, 100). 00:08:02.921 INFO: Seed: 3746674823 00:08:02.921 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:02.921 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:02.921 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:02.921 INFO: A corpus is not provided, starting from an empty corpus 00:08:02.921 #2 INITED exec/s: 0 rss: 61Mb 00:08:02.921 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
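The `(( i++ ))` / `(( i < fuzz_num ))` records here are the driver loop from test/fuzz/llvm/common.sh stepping through the seven fuzzer types, one run each. A stubbed sketch of that loop; the constants come from the `fuzz_num=7`, `time=1`, and `start_llvm_fuzz 0 1 0x1` records earlier in the log, while the stub body only hints at what the real function does:

    #!/usr/bin/env bash
    fuzz_num=7   # one per '.fn =' entry counted in llvm_vfio_fuzz.c
    time=1       # per-run time budget handed to the fuzzer as -t

    start_llvm_fuzz() {
        local fuzzer_type=$1 timen=$2 core=$3
        # The real function rebuilds /tmp/vfio-user-$fuzzer_type, rewrites
        # the JSON config, and launches the instrumented binary.
        echo "run fuzzer type $fuzzer_type (budget $timen, core mask $core)"
    }

    for (( i = 0; i < fuzz_num; i++ )); do
        start_llvm_fuzz "$i" "$time" 0x1
    done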
00:08:02.921 This may also happen if the target rejected all inputs we tried so far 00:08:03.179 [2024-11-27 06:17:32.532634] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:03.179 [2024-11-27 06:17:32.532668] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:03.179 [2024-11-27 06:17:32.532686] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:03.438 NEW_FUNC[1/638]: 0x43a7b8 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:03.438 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:03.438 #5 NEW cov: 10782 ft: 10750 corp: 2/10b lim: 40 exec/s: 0 rss: 68Mb L: 9/9 MS: 3 ChangeBit-CrossOver-CMP- DE: "\347>\221\307\236-\222\000"- 00:08:03.696 [2024-11-27 06:17:32.997367] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:03.696 [2024-11-27 06:17:32.997401] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:03.696 [2024-11-27 06:17:32.997419] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:03.696 #6 NEW cov: 10796 ft: 13951 corp: 3/20b lim: 40 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 InsertByte- 00:08:03.696 [2024-11-27 06:17:33.179769] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:03.696 [2024-11-27 06:17:33.179792] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:03.696 [2024-11-27 06:17:33.179809] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:03.955 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:03.955 #7 NEW cov: 10813 ft: 15245 corp: 4/30b lim: 40 exec/s: 0 rss: 70Mb L: 10/10 MS: 1 ChangeBit- 00:08:03.955 [2024-11-27 06:17:33.362089] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:03.955 [2024-11-27 06:17:33.362112] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:03.955 [2024-11-27 06:17:33.362129] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:03.955 #8 NEW cov: 10813 ft: 15623 corp: 5/39b lim: 40 exec/s: 8 rss: 70Mb L: 9/10 MS: 1 ShuffleBytes- 00:08:04.215 [2024-11-27 06:17:33.544046] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:04.215 [2024-11-27 06:17:33.544068] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:04.215 [2024-11-27 06:17:33.544086] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:04.215 #9 NEW cov: 10813 ft: 16441 corp: 6/49b lim: 40 exec/s: 9 rss: 70Mb L: 10/10 MS: 1 ChangeBinInt- 00:08:04.215 [2024-11-27 06:17:33.724383] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:04.215 [2024-11-27 06:17:33.724405] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:04.215 [2024-11-27 06:17:33.724422] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:04.473 #10 NEW cov: 10813 ft: 16554 corp: 7/60b lim: 40 exec/s: 10 rss: 70Mb L: 11/11 MS: 1 InsertByte- 00:08:04.473 [2024-11-27 06:17:33.907628] 
vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:04.473 [2024-11-27 06:17:33.907651] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:04.473 [2024-11-27 06:17:33.907667] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:04.732 #12 NEW cov: 10813 ft: 16989 corp: 8/80b lim: 40 exec/s: 12 rss: 70Mb L: 20/20 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:04.732 [2024-11-27 06:17:34.100803] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:04.732 [2024-11-27 06:17:34.100827] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:04.732 [2024-11-27 06:17:34.100844] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:04.732 #13 NEW cov: 10813 ft: 17010 corp: 9/89b lim: 40 exec/s: 13 rss: 70Mb L: 9/20 MS: 1 ChangeBinInt- 00:08:04.991 [2024-11-27 06:17:34.283090] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:04.991 [2024-11-27 06:17:34.283113] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:04.991 [2024-11-27 06:17:34.283129] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:04.991 #14 NEW cov: 10820 ft: 17075 corp: 10/119b lim: 40 exec/s: 14 rss: 70Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:08:04.991 [2024-11-27 06:17:34.464391] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:04.991 [2024-11-27 06:17:34.464413] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:04.991 [2024-11-27 06:17:34.464429] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:05.251 #16 pulse cov: 10820 ft: 17176 corp: 10/119b lim: 40 exec/s: 8 rss: 70Mb 00:08:05.251 #16 NEW cov: 10820 ft: 17176 corp: 11/123b lim: 40 exec/s: 8 rss: 70Mb L: 4/30 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:05.251 #16 DONE cov: 10820 ft: 17176 corp: 11/123b lim: 40 exec/s: 8 rss: 70Mb 00:08:05.251 ###### Recommended dictionary. ###### 00:08:05.251 "\347>\221\307\236-\222\000" # Uses: 0 00:08:05.251 ###### End of recommended dictionary. 
###### 00:08:05.251 Done 16 runs in 2 second(s) 00:08:05.510 06:17:34 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:05.510 06:17:34 -- ../common.sh@72 -- # (( i++ )) 00:08:05.510 06:17:34 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.510 06:17:34 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:05.510 06:17:34 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:05.510 06:17:34 -- vfio/run.sh@23 -- # local timen=1 00:08:05.510 06:17:34 -- vfio/run.sh@24 -- # local core=0x1 00:08:05.510 06:17:34 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:05.510 06:17:34 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:05.510 06:17:34 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:05.510 06:17:34 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:05.510 06:17:34 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:05.510 06:17:34 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:05.510 06:17:34 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:05.510 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:05.510 06:17:34 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:05.510 [2024-11-27 06:17:34.880367] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:05.510 [2024-11-27 06:17:34.880436] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid41333 ] 00:08:05.510 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.510 [2024-11-27 06:17:34.951578] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.510 [2024-11-27 06:17:35.019783] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:05.510 [2024-11-27 06:17:35.019944] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.769 INFO: Running with entropic power schedule (0xFF, 100). 00:08:05.769 INFO: Seed: 2230686681 00:08:05.769 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:05.769 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:05.769 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:05.769 INFO: A corpus is not provided, starting from an empty corpus 00:08:05.769 #2 INITED exec/s: 0 rss: 62Mb 00:08:05.769 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
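Each iteration starts by rebuilding a per-instance sandbox, as the rm -rf and mkdir records above show: fresh /tmp/vfio-user-N directories, plus a sed pass that points the shared JSON config at this instance's vfio-user sockets. A sketch of that setup; the redirection target is inferred from the $vfiouser_cfg variable run.sh sets, since xtrace does not display redirections:

    #!/usr/bin/env bash
    N=2   # instance number, matching the run above
    template=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf

    mkdir -p /tmp/vfio-user-$N/domain/1 /tmp/vfio-user-$N/domain/2

    # Rewrite the socket paths for this instance; '%' is the sed
    # delimiter so the slashes in the paths need no escaping.
    sed -e "s%/tmp/vfio-user/domain/1%/tmp/vfio-user-$N/domain/1%" \
        -e "s%/tmp/vfio-user/domain/2%/tmp/vfio-user-$N/domain/2%" \
        "$template" > /tmp/vfio-user-$N/fuzz_vfio_json.conf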
00:08:05.769 This may also happen if the target rejected all inputs we tried so far 00:08:06.029 [2024-11-27 06:17:35.308727] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:06.288 NEW_FUNC[1/636]: 0x43b1a8 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:06.288 NEW_FUNC[2/636]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:06.288 #5 NEW cov: 10758 ft: 10722 corp: 2/10b lim: 80 exec/s: 0 rss: 66Mb L: 9/9 MS: 3 ShuffleBytes-CopyPart-CMP- DE: "\000\222-\240e\217Y\222"- 00:08:06.288 [2024-11-27 06:17:35.767841] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:06.547 #6 NEW cov: 10772 ft: 14188 corp: 3/19b lim: 80 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:06.547 [2024-11-27 06:17:35.949410] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:06.547 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:06.547 #7 NEW cov: 10789 ft: 15184 corp: 4/29b lim: 80 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 InsertByte- 00:08:06.806 [2024-11-27 06:17:36.131285] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:06.806 #8 NEW cov: 10789 ft: 15908 corp: 5/47b lim: 80 exec/s: 8 rss: 69Mb L: 18/18 MS: 1 PersAutoDict- DE: "\000\222-\240e\217Y\222"- 00:08:06.806 [2024-11-27 06:17:36.311833] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:07.065 #9 NEW cov: 10789 ft: 15949 corp: 6/66b lim: 80 exec/s: 9 rss: 69Mb L: 19/19 MS: 1 InsertByte- 00:08:07.065 [2024-11-27 06:17:36.494018] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:07.324 #10 NEW cov: 10789 ft: 16272 corp: 7/76b lim: 80 exec/s: 10 rss: 69Mb L: 10/19 MS: 1 ChangeBinInt- 00:08:07.324 [2024-11-27 06:17:36.674615] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:07.324 #11 NEW cov: 10789 ft: 16558 corp: 8/86b lim: 80 exec/s: 11 rss: 69Mb L: 10/19 MS: 1 ChangeByte- 00:08:07.324 [2024-11-27 06:17:36.856301] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:07.583 #12 NEW cov: 10789 ft: 16809 corp: 9/109b lim: 80 exec/s: 12 rss: 69Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:07.583 [2024-11-27 06:17:37.037049] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:07.842 #13 NEW cov: 10796 ft: 16980 corp: 10/128b lim: 80 exec/s: 13 rss: 70Mb L: 19/23 MS: 1 CrossOver- 00:08:07.842 [2024-11-27 06:17:37.219204] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:07.842 #14 NEW cov: 10796 ft: 17178 corp: 11/138b lim: 80 exec/s: 7 rss: 70Mb L: 10/23 MS: 1 InsertByte- 00:08:07.842 #14 DONE cov: 10796 ft: 17178 corp: 11/138b lim: 80 exec/s: 7 rss: 70Mb 00:08:07.842 ###### Recommended dictionary. ###### 00:08:07.842 "\000\222-\240e\217Y\222" # Uses: 1 00:08:07.842 ###### End of recommended dictionary. 
###### 00:08:07.842 Done 14 runs in 2 second(s) 00:08:08.102 06:17:37 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:08.102 06:17:37 -- ../common.sh@72 -- # (( i++ )) 00:08:08.102 06:17:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.102 06:17:37 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:08.102 06:17:37 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:08.102 06:17:37 -- vfio/run.sh@23 -- # local timen=1 00:08:08.102 06:17:37 -- vfio/run.sh@24 -- # local core=0x1 00:08:08.102 06:17:37 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:08.102 06:17:37 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:08.102 06:17:37 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:08.102 06:17:37 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:08.102 06:17:37 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:08.102 06:17:37 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:08.102 06:17:37 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:08.102 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:08.102 06:17:37 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:08.361 [2024-11-27 06:17:37.638536] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:08.361 [2024-11-27 06:17:37.638610] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid41872 ] 00:08:08.361 EAL: No free 2048 kB hugepages reported on node 1 00:08:08.361 [2024-11-27 06:17:37.711061] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.361 [2024-11-27 06:17:37.780320] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:08.361 [2024-11-27 06:17:37.780479] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.620 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.620 INFO: Seed: 695710298 00:08:08.620 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:08.620 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:08.620 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:08.620 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.620 #2 INITED exec/s: 0 rss: 61Mb 00:08:08.620 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:08.620 This may also happen if the target rejected all inputs we tried so far 00:08:09.138 NEW_FUNC[1/620]: 0x43b898 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:09.138 NEW_FUNC[2/620]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:09.138 #9 NEW cov: 10600 ft: 10717 corp: 2/89b lim: 320 exec/s: 0 rss: 66Mb L: 88/88 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:09.138 NEW_FUNC[1/12]: 0x13537c8 in handle_cmd_rsp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:2498 00:08:09.138 NEW_FUNC[2/12]: 0x15d3c58 in _is_io_flags_valid /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ns_cmd.c:141 00:08:09.138 #10 NEW cov: 10766 ft: 14347 corp: 3/177b lim: 320 exec/s: 0 rss: 68Mb L: 88/88 MS: 1 CrossOver- 00:08:09.397 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:09.397 #11 NEW cov: 10783 ft: 15040 corp: 4/238b lim: 320 exec/s: 0 rss: 69Mb L: 61/88 MS: 1 EraseBytes- 00:08:09.656 #12 NEW cov: 10783 ft: 15463 corp: 5/300b lim: 320 exec/s: 12 rss: 69Mb L: 62/88 MS: 1 InsertByte- 00:08:09.921 #13 NEW cov: 10783 ft: 15744 corp: 6/343b lim: 320 exec/s: 13 rss: 69Mb L: 43/88 MS: 1 EraseBytes- 00:08:09.921 #14 NEW cov: 10783 ft: 15956 corp: 7/431b lim: 320 exec/s: 14 rss: 69Mb L: 88/88 MS: 1 ChangeBit- 00:08:10.179 #15 NEW cov: 10783 ft: 16155 corp: 8/493b lim: 320 exec/s: 15 rss: 69Mb L: 62/88 MS: 1 ShuffleBytes- 00:08:10.438 #16 NEW cov: 10790 ft: 16233 corp: 9/581b lim: 320 exec/s: 16 rss: 69Mb L: 88/88 MS: 1 ChangeBinInt- 00:08:10.697 #17 NEW cov: 10790 ft: 16402 corp: 10/669b lim: 320 exec/s: 8 rss: 69Mb L: 88/88 MS: 1 CopyPart- 00:08:10.697 #17 DONE cov: 10790 ft: 16402 corp: 10/669b lim: 320 exec/s: 8 rss: 69Mb 00:08:10.697 Done 17 runs in 2 second(s) 00:08:10.957 06:17:40 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:10.957 06:17:40 -- ../common.sh@72 -- # (( i++ )) 00:08:10.957 06:17:40 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.957 06:17:40 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:10.957 06:17:40 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:10.957 06:17:40 -- vfio/run.sh@23 -- # local timen=1 00:08:10.957 06:17:40 -- vfio/run.sh@24 -- # local core=0x1 00:08:10.957 06:17:40 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:10.957 06:17:40 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:10.957 06:17:40 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:10.957 06:17:40 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:10.957 06:17:40 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:10.957 06:17:40 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:10.957 06:17:40 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:10.957 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:10.957 06:17:40 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:10.957 [2024-11-27 06:17:40.305220] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:10.957 [2024-11-27 06:17:40.305315] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid42195 ] 00:08:10.957 EAL: No free 2048 kB hugepages reported on node 1 00:08:10.957 [2024-11-27 06:17:40.379333] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.957 [2024-11-27 06:17:40.451200] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:10.957 [2024-11-27 06:17:40.451346] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.216 INFO: Running with entropic power schedule (0xFF, 100). 00:08:11.216 INFO: Seed: 3371724724 00:08:11.216 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:11.216 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:11.216 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:11.216 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.216 #2 INITED exec/s: 0 rss: 62Mb 00:08:11.216 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:11.216 This may also happen if the target rejected all inputs we tried so far 00:08:11.734 NEW_FUNC[1/632]: 0x43c118 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:11.734 NEW_FUNC[2/632]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:11.734 #3 NEW cov: 10747 ft: 10540 corp: 2/85b lim: 320 exec/s: 0 rss: 67Mb L: 84/84 MS: 1 InsertRepeatedBytes- 00:08:11.993 #9 NEW cov: 10761 ft: 13317 corp: 3/135b lim: 320 exec/s: 0 rss: 68Mb L: 50/84 MS: 1 EraseBytes- 00:08:12.252 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:12.253 #10 NEW cov: 10781 ft: 14430 corp: 4/219b lim: 320 exec/s: 0 rss: 69Mb L: 84/84 MS: 1 ChangeByte- 00:08:12.253 #14 NEW cov: 10781 ft: 15025 corp: 5/305b lim: 320 exec/s: 14 rss: 70Mb L: 86/86 MS: 4 ChangeByte-ShuffleBytes-CopyPart-CrossOver- 00:08:12.512 #15 NEW cov: 10781 ft: 15199 corp: 6/475b lim: 320 exec/s: 15 rss: 70Mb L: 170/170 MS: 1 CrossOver- 00:08:12.770 #16 NEW cov: 10781 ft: 15774 corp: 7/635b lim: 320 exec/s: 16 rss: 70Mb L: 160/170 MS: 1 InsertRepeatedBytes- 00:08:13.029 #17 NEW cov: 10781 ft: 15862 corp: 8/670b lim: 320 exec/s: 17 rss: 70Mb L: 35/170 MS: 1 CrossOver- 00:08:13.029 #18 NEW cov: 10788 ft: 15885 corp: 9/754b lim: 320 exec/s: 18 rss: 70Mb L: 84/170 MS: 1 ChangeBinInt- 00:08:13.289 #19 NEW cov: 10788 ft: 16087 corp: 10/804b lim: 320 exec/s: 9 rss: 70Mb L: 50/170 MS: 1 ChangeBit- 00:08:13.289 #19 DONE cov: 10788 ft: 16087 corp: 10/804b lim: 320 exec/s: 9 rss: 70Mb 00:08:13.289 Done 19 runs in 2 second(s) 00:08:13.549 06:17:42 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:13.549 06:17:42 -- ../common.sh@72 -- # (( 
i++ )) 00:08:13.549 06:17:42 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:13.549 06:17:42 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:13.549 06:17:42 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:13.549 06:17:42 -- vfio/run.sh@23 -- # local timen=1 00:08:13.549 06:17:42 -- vfio/run.sh@24 -- # local core=0x1 00:08:13.549 06:17:42 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:13.549 06:17:42 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:13.549 06:17:42 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:13.549 06:17:42 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:13.549 06:17:42 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:13.549 06:17:42 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:13.549 06:17:43 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:13.549 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:13.549 06:17:43 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:13.549 [2024-11-27 06:17:43.035344] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:13.549 [2024-11-27 06:17:43.035437] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid42715 ] 00:08:13.549 EAL: No free 2048 kB hugepages reported on node 1 00:08:13.808 [2024-11-27 06:17:43.107508] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.808 [2024-11-27 06:17:43.176585] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:13.808 [2024-11-27 06:17:43.176750] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.067 INFO: Running with entropic power schedule (0xFF, 100). 00:08:14.067 INFO: Seed: 1798749203 00:08:14.067 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:14.067 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:14.067 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:14.067 INFO: A corpus is not provided, starting from an empty corpus 00:08:14.067 #2 INITED exec/s: 0 rss: 61Mb 00:08:14.067 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:14.067 This may also happen if the target rejected all inputs we tried so far 00:08:14.067 [2024-11-27 06:17:43.471629] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:14.067 [2024-11-27 06:17:43.471671] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:14.326 NEW_FUNC[1/637]: 0x43cb18 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:14.326 NEW_FUNC[2/637]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:14.326 #3 NEW cov: 10771 ft: 10735 corp: 2/80b lim: 120 exec/s: 0 rss: 67Mb L: 79/79 MS: 1 InsertRepeatedBytes- 00:08:14.584 [2024-11-27 06:17:43.939726] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:14.584 [2024-11-27 06:17:43.939780] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:14.584 NEW_FUNC[1/1]: 0x1675708 in _nvme_qpair_complete_abort_queued_reqs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:593 00:08:14.584 #9 NEW cov: 10798 ft: 13808 corp: 3/159b lim: 120 exec/s: 0 rss: 68Mb L: 79/79 MS: 1 ChangeBit- 00:08:14.842 [2024-11-27 06:17:44.124299] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:14.843 [2024-11-27 06:17:44.124328] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:14.843 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:14.843 #20 NEW cov: 10815 ft: 14632 corp: 4/238b lim: 120 exec/s: 0 rss: 69Mb L: 79/79 MS: 1 ChangeBit- 00:08:14.843 [2024-11-27 06:17:44.310831] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:14.843 [2024-11-27 06:17:44.310860] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:15.100 #22 NEW cov: 10815 ft: 15204 corp: 5/318b lim: 120 exec/s: 22 rss: 69Mb L: 80/80 MS: 2 ShuffleBytes-CrossOver- 00:08:15.100 [2024-11-27 06:17:44.505302] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:15.100 [2024-11-27 06:17:44.505333] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:15.100 #33 NEW cov: 10815 ft: 15461 corp: 6/397b lim: 120 exec/s: 33 rss: 69Mb L: 79/80 MS: 1 ChangeBinInt- 00:08:15.359 [2024-11-27 06:17:44.689627] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:15.359 [2024-11-27 06:17:44.689657] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:15.359 #36 NEW cov: 10815 ft: 15584 corp: 7/469b lim: 120 exec/s: 36 rss: 69Mb L: 72/80 MS: 3 InsertByte-CopyPart-CrossOver- 00:08:15.359 [2024-11-27 06:17:44.873152] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:15.359 [2024-11-27 06:17:44.873182] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:15.619 #37 NEW cov: 10815 ft: 15794 corp: 8/549b lim: 120 exec/s: 37 rss: 69Mb L: 80/80 MS: 1 ChangeByte- 00:08:15.619 [2024-11-27 06:17:45.054623] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:15.619 [2024-11-27 06:17:45.054654] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return 
failure 00:08:15.878 #38 NEW cov: 10815 ft: 15955 corp: 9/611b lim: 120 exec/s: 38 rss: 69Mb L: 62/80 MS: 1 EraseBytes- 00:08:15.878 [2024-11-27 06:17:45.239993] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:15.878 [2024-11-27 06:17:45.240024] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:15.878 #39 NEW cov: 10822 ft: 15975 corp: 10/659b lim: 120 exec/s: 39 rss: 70Mb L: 48/80 MS: 1 EraseBytes- 00:08:16.138 [2024-11-27 06:17:45.425371] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:16.138 [2024-11-27 06:17:45.425401] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:16.138 #45 NEW cov: 10822 ft: 16016 corp: 11/738b lim: 120 exec/s: 22 rss: 70Mb L: 79/80 MS: 1 ShuffleBytes- 00:08:16.138 #45 DONE cov: 10822 ft: 16016 corp: 11/738b lim: 120 exec/s: 22 rss: 70Mb 00:08:16.138 Done 45 runs in 2 second(s) 00:08:16.398 06:17:45 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:08:16.398 06:17:45 -- ../common.sh@72 -- # (( i++ )) 00:08:16.398 06:17:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:16.398 06:17:45 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:16.398 06:17:45 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:16.398 06:17:45 -- vfio/run.sh@23 -- # local timen=1 00:08:16.398 06:17:45 -- vfio/run.sh@24 -- # local core=0x1 00:08:16.398 06:17:45 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:16.398 06:17:45 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:16.398 06:17:45 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:16.398 06:17:45 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:16.398 06:17:45 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:16.398 06:17:45 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:16.398 06:17:45 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:16.398 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:16.398 06:17:45 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:16.398 [2024-11-27 06:17:45.843799] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:08:16.398 [2024-11-27 06:17:45.843893] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid43264 ] 00:08:16.398 EAL: No free 2048 kB hugepages reported on node 1 00:08:16.398 [2024-11-27 06:17:45.915298] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.658 [2024-11-27 06:17:45.985593] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:16.658 [2024-11-27 06:17:45.985762] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.658 INFO: Running with entropic power schedule (0xFF, 100). 00:08:16.658 INFO: Seed: 309801708 00:08:16.658 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:16.658 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:16.658 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:16.658 INFO: A corpus is not provided, starting from an empty corpus 00:08:16.658 #2 INITED exec/s: 0 rss: 62Mb 00:08:16.658 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:16.658 This may also happen if the target rejected all inputs we tried so far 00:08:16.917 [2024-11-27 06:17:46.293790] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:16.917 [2024-11-27 06:17:46.293867] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:17.176 NEW_FUNC[1/638]: 0x43d808 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:17.176 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:17.176 #11 NEW cov: 10773 ft: 10597 corp: 2/14b lim: 90 exec/s: 0 rss: 67Mb L: 13/13 MS: 4 InsertByte-InsertByte-CopyPart-CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:17.436 [2024-11-27 06:17:46.794901] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:17.436 [2024-11-27 06:17:46.794944] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:17.436 #13 NEW cov: 10787 ft: 13265 corp: 3/34b lim: 90 exec/s: 0 rss: 69Mb L: 20/20 MS: 2 InsertRepeatedBytes-CrossOver- 00:08:17.695 [2024-11-27 06:17:47.003537] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:17.695 [2024-11-27 06:17:47.003568] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:17.695 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:17.695 #14 NEW cov: 10807 ft: 15451 corp: 4/54b lim: 90 exec/s: 0 rss: 70Mb L: 20/20 MS: 1 CMP- DE: "\362\364\300\004\000\000\000\000"- 00:08:17.695 [2024-11-27 06:17:47.207844] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:17.695 [2024-11-27 06:17:47.207875] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:17.954 #15 NEW cov: 10807 ft: 16533 corp: 5/75b lim: 90 exec/s: 15 rss: 70Mb L: 21/21 MS: 1 InsertByte- 00:08:17.954 [2024-11-27 06:17:47.407014] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: 
Invalid argument 00:08:17.954 [2024-11-27 06:17:47.407043] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:18.214 #16 NEW cov: 10807 ft: 16999 corp: 6/104b lim: 90 exec/s: 16 rss: 71Mb L: 29/29 MS: 1 CopyPart- 00:08:18.214 [2024-11-27 06:17:47.612474] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:18.214 [2024-11-27 06:17:47.612508] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:18.214 #18 NEW cov: 10807 ft: 17225 corp: 7/134b lim: 90 exec/s: 18 rss: 71Mb L: 30/30 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:18.473 [2024-11-27 06:17:47.826223] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:18.473 [2024-11-27 06:17:47.826253] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:18.473 #19 NEW cov: 10807 ft: 17406 corp: 8/160b lim: 90 exec/s: 19 rss: 71Mb L: 26/30 MS: 1 EraseBytes- 00:08:18.732 [2024-11-27 06:17:48.027255] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:18.732 [2024-11-27 06:17:48.027287] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:18.732 #20 NEW cov: 10814 ft: 17657 corp: 9/189b lim: 90 exec/s: 20 rss: 71Mb L: 29/30 MS: 1 ChangeASCIIInt- 00:08:18.732 [2024-11-27 06:17:48.231271] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:18.732 [2024-11-27 06:17:48.231302] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:18.992 #21 NEW cov: 10814 ft: 17762 corp: 10/209b lim: 90 exec/s: 10 rss: 71Mb L: 20/30 MS: 1 CMP- DE: "\012\000"- 00:08:18.992 #21 DONE cov: 10814 ft: 17762 corp: 10/209b lim: 90 exec/s: 10 rss: 71Mb 00:08:18.992 ###### Recommended dictionary. ###### 00:08:18.992 "\377\377\377\377\377\377\377\377" # Uses: 0 00:08:18.992 "\362\364\300\004\000\000\000\000" # Uses: 0 00:08:18.992 "\012\000" # Uses: 0 00:08:18.992 ###### End of recommended dictionary. 
###### 00:08:18.992 Done 21 runs in 2 second(s) 00:08:19.251 06:17:48 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:08:19.251 06:17:48 -- ../common.sh@72 -- # (( i++ )) 00:08:19.251 06:17:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:19.251 06:17:48 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:08:19.251 00:08:19.251 real 0m19.679s 00:08:19.251 user 0m27.775s 00:08:19.251 sys 0m1.840s 00:08:19.251 06:17:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:19.251 06:17:48 -- common/autotest_common.sh@10 -- # set +x 00:08:19.251 ************************************ 00:08:19.251 END TEST vfio_fuzz 00:08:19.251 ************************************ 00:08:19.251 00:08:19.251 real 1m23.745s 00:08:19.251 user 2m8.709s 00:08:19.251 sys 0m8.755s 00:08:19.251 06:17:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:19.251 06:17:48 -- common/autotest_common.sh@10 -- # set +x 00:08:19.251 ************************************ 00:08:19.251 END TEST llvm_fuzz 00:08:19.251 ************************************ 00:08:19.251 06:17:48 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:08:19.251 06:17:48 -- spdk/autotest.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:08:19.251 06:17:48 -- spdk/autotest.sh@372 -- # timing_enter post_cleanup 00:08:19.251 06:17:48 -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:19.251 06:17:48 -- common/autotest_common.sh@10 -- # set +x 00:08:19.251 06:17:48 -- spdk/autotest.sh@373 -- # autotest_cleanup 00:08:19.251 06:17:48 -- common/autotest_common.sh@1381 -- # local autotest_es=0 00:08:19.251 06:17:48 -- common/autotest_common.sh@1382 -- # xtrace_disable 00:08:19.252 06:17:48 -- common/autotest_common.sh@10 -- # set +x 00:08:25.910 INFO: APP EXITING 00:08:25.910 INFO: killing all VMs 00:08:25.910 INFO: killing vhost app 00:08:25.910 INFO: EXIT DONE 00:08:28.450 Waiting for block devices as requested 00:08:28.450 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:28.450 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:28.450 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:28.727 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:28.727 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:28.727 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:28.987 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:28.987 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:28.987 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:28.987 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:29.246 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:29.246 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:29.246 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:29.506 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:29.506 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:29.506 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:29.766 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:08:33.961 Cleaning 00:08:33.961 Removing: /dev/shm/spdk_tgt_trace.pid5272 00:08:33.961 Removing: /var/run/dpdk/spdk_pid12504 00:08:33.961 Removing: /var/run/dpdk/spdk_pid12874 00:08:33.961 Removing: /var/run/dpdk/spdk_pid13524 00:08:33.961 Removing: /var/run/dpdk/spdk_pid13819 00:08:33.961 Removing: /var/run/dpdk/spdk_pid14393 00:08:33.961 Removing: /var/run/dpdk/spdk_pid14409 00:08:33.961 Removing: /var/run/dpdk/spdk_pid14976 00:08:33.961 Removing: /var/run/dpdk/spdk_pid15248 00:08:33.962 Removing: /var/run/dpdk/spdk_pid15548 00:08:33.962 Removing: /var/run/dpdk/spdk_pid15570 00:08:33.962 Removing: /var/run/dpdk/spdk_pid15860 00:08:33.962 Removing: 
/var/run/dpdk/spdk_pid16023 00:08:33.962 Removing: /var/run/dpdk/spdk_pid16508 00:08:33.962 Removing: /var/run/dpdk/spdk_pid16794 00:08:33.962 Removing: /var/run/dpdk/spdk_pid17079 00:08:33.962 Removing: /var/run/dpdk/spdk_pid17188 00:08:33.962 Removing: /var/run/dpdk/spdk_pid17467 00:08:33.962 Removing: /var/run/dpdk/spdk_pid17609 00:08:33.962 Removing: /var/run/dpdk/spdk_pid17796 00:08:33.962 Removing: /var/run/dpdk/spdk_pid18073 00:08:33.962 Removing: /var/run/dpdk/spdk_pid18305 00:08:33.962 Removing: /var/run/dpdk/spdk_pid18468 00:08:33.962 Removing: /var/run/dpdk/spdk_pid18681 00:08:33.962 Removing: /var/run/dpdk/spdk_pid18931 00:08:33.962 Removing: /var/run/dpdk/spdk_pid19218 00:08:33.962 Removing: /var/run/dpdk/spdk_pid19486 00:08:33.962 Removing: /var/run/dpdk/spdk_pid19772 00:08:33.962 Removing: /var/run/dpdk/spdk_pid20040 00:08:33.962 Removing: /var/run/dpdk/spdk_pid20266 00:08:33.962 Removing: /var/run/dpdk/spdk_pid20442 00:08:33.962 Removing: /var/run/dpdk/spdk_pid20654 00:08:33.962 Removing: /var/run/dpdk/spdk_pid20906 00:08:33.962 Removing: /var/run/dpdk/spdk_pid21194 00:08:33.962 Removing: /var/run/dpdk/spdk_pid21462 00:08:33.962 Removing: /var/run/dpdk/spdk_pid21744 00:08:33.962 Removing: /var/run/dpdk/spdk_pid22020 00:08:33.962 Removing: /var/run/dpdk/spdk_pid22247 00:08:33.962 Removing: /var/run/dpdk/spdk_pid22407 00:08:33.962 Removing: /var/run/dpdk/spdk_pid22618 00:08:33.962 Removing: /var/run/dpdk/spdk_pid22878 00:08:33.962 Removing: /var/run/dpdk/spdk_pid23163 00:08:33.962 Removing: /var/run/dpdk/spdk_pid23432 00:08:33.962 Removing: /var/run/dpdk/spdk_pid23721 00:08:33.962 Removing: /var/run/dpdk/spdk_pid23987 00:08:33.962 Removing: /var/run/dpdk/spdk_pid24231 00:08:33.962 Removing: /var/run/dpdk/spdk_pid24404 00:08:33.962 Removing: /var/run/dpdk/spdk_pid24603 00:08:33.962 Removing: /var/run/dpdk/spdk_pid24849 00:08:33.962 Removing: /var/run/dpdk/spdk_pid25136 00:08:33.962 Removing: /var/run/dpdk/spdk_pid25408 00:08:33.962 Removing: /var/run/dpdk/spdk_pid25689 00:08:33.962 Removing: /var/run/dpdk/spdk_pid25970 00:08:33.962 Removing: /var/run/dpdk/spdk_pid26230 00:08:33.962 Removing: /var/run/dpdk/spdk_pid2624 00:08:33.962 Removing: /var/run/dpdk/spdk_pid26405 00:08:33.962 Removing: /var/run/dpdk/spdk_pid26618 00:08:33.962 Removing: /var/run/dpdk/spdk_pid26842 00:08:33.962 Removing: /var/run/dpdk/spdk_pid27129 00:08:33.962 Removing: /var/run/dpdk/spdk_pid27402 00:08:33.962 Removing: /var/run/dpdk/spdk_pid27686 00:08:33.962 Removing: /var/run/dpdk/spdk_pid27785 00:08:33.962 Removing: /var/run/dpdk/spdk_pid28203 00:08:33.962 Removing: /var/run/dpdk/spdk_pid28863 00:08:33.962 Removing: /var/run/dpdk/spdk_pid29405 00:08:33.962 Removing: /var/run/dpdk/spdk_pid29765 00:08:33.962 Removing: /var/run/dpdk/spdk_pid30239 00:08:33.962 Removing: /var/run/dpdk/spdk_pid30781 00:08:33.962 Removing: /var/run/dpdk/spdk_pid31082 00:08:33.962 Removing: /var/run/dpdk/spdk_pid31618 00:08:33.962 Removing: /var/run/dpdk/spdk_pid32085 00:08:33.962 Removing: /var/run/dpdk/spdk_pid32441 00:08:33.962 Removing: /var/run/dpdk/spdk_pid32985 00:08:33.962 Removing: /var/run/dpdk/spdk_pid33417 00:08:33.962 Removing: /var/run/dpdk/spdk_pid33828 00:08:33.962 Removing: /var/run/dpdk/spdk_pid34365 00:08:33.962 Removing: /var/run/dpdk/spdk_pid34766 00:08:33.962 Removing: /var/run/dpdk/spdk_pid35196 00:08:33.962 Removing: /var/run/dpdk/spdk_pid35744 00:08:33.962 Removing: /var/run/dpdk/spdk_pid36055 00:08:33.962 Removing: /var/run/dpdk/spdk_pid36577 00:08:33.962 Removing: /var/run/dpdk/spdk_pid37083 
00:08:33.962 Removing: /var/run/dpdk/spdk_pid37414 00:08:33.962 Removing: /var/run/dpdk/spdk_pid37950 00:08:33.962 Removing: /var/run/dpdk/spdk_pid38377 00:08:33.962 Removing: /var/run/dpdk/spdk_pid38784 00:08:33.962 Removing: /var/run/dpdk/spdk_pid39324 00:08:33.962 Removing: /var/run/dpdk/spdk_pid39706 00:08:33.962 Removing: /var/run/dpdk/spdk_pid40277 00:08:33.962 Removing: /var/run/dpdk/spdk_pid4044 00:08:33.962 Removing: /var/run/dpdk/spdk_pid40789 00:08:33.962 Removing: /var/run/dpdk/spdk_pid41333 00:08:33.962 Removing: /var/run/dpdk/spdk_pid41872 00:08:33.962 Removing: /var/run/dpdk/spdk_pid42195 00:08:33.962 Removing: /var/run/dpdk/spdk_pid42715 00:08:33.962 Removing: /var/run/dpdk/spdk_pid43264 00:08:33.962 Removing: /var/run/dpdk/spdk_pid5272 00:08:33.962 Removing: /var/run/dpdk/spdk_pid6072 00:08:33.962 Removing: /var/run/dpdk/spdk_pid6397 00:08:33.962 Removing: /var/run/dpdk/spdk_pid6731 00:08:33.962 Removing: /var/run/dpdk/spdk_pid7077 00:08:33.962 Removing: /var/run/dpdk/spdk_pid7409 00:08:33.962 Removing: /var/run/dpdk/spdk_pid7696 00:08:33.962 Removing: /var/run/dpdk/spdk_pid7978 00:08:33.962 Removing: /var/run/dpdk/spdk_pid8301 00:08:33.962 Removing: /var/run/dpdk/spdk_pid9181 00:08:33.962 Clean 00:08:33.962 killing process with pid 4148280 00:08:37.253 killing process with pid 4148277 00:08:37.253 killing process with pid 4148279 00:08:37.253 killing process with pid 4148278 00:08:37.253 06:18:06 -- common/autotest_common.sh@1446 -- # return 0 00:08:37.253 06:18:06 -- spdk/autotest.sh@374 -- # timing_exit post_cleanup 00:08:37.253 06:18:06 -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:37.253 06:18:06 -- common/autotest_common.sh@10 -- # set +x 00:08:37.513 06:18:06 -- spdk/autotest.sh@376 -- # timing_exit autotest 00:08:37.513 06:18:06 -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:37.513 06:18:06 -- common/autotest_common.sh@10 -- # set +x 00:08:37.513 06:18:06 -- spdk/autotest.sh@377 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:08:37.513 06:18:06 -- spdk/autotest.sh@379 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:08:37.513 06:18:06 -- spdk/autotest.sh@379 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:08:37.513 06:18:06 -- spdk/autotest.sh@381 -- # [[ y == y ]] 00:08:37.513 06:18:06 -- spdk/autotest.sh@383 -- # hostname 00:08:37.513 06:18:06 -- spdk/autotest.sh@383 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:08:37.772 geninfo: WARNING: invalid characters removed from testname! 
00:08:38.341 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcda 00:08:38.341 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcda 00:08:38.341 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcda 00:08:50.568 06:18:18 -- spdk/autotest.sh@384 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:08:55.844 06:18:25 -- spdk/autotest.sh@385 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:01.120 06:18:29 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:05.316 06:18:34 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:10.590 06:18:39 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:14.788 06:18:43 -- spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 
-q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:20.067 06:18:48 -- spdk/autotest.sh@393 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:09:20.067 06:18:48 -- common/autotest_common.sh@1689 -- $ [[ y == y ]] 00:09:20.067 06:18:48 -- common/autotest_common.sh@1690 -- $ awk '{print $NF}' 00:09:20.067 06:18:48 -- common/autotest_common.sh@1690 -- $ lcov --version 00:09:20.067 06:18:48 -- common/autotest_common.sh@1690 -- $ lt 1.15 2 00:09:20.067 06:18:48 -- scripts/common.sh@372 -- $ cmp_versions 1.15 '<' 2 00:09:20.067 06:18:48 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:09:20.067 06:18:48 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:09:20.067 06:18:48 -- scripts/common.sh@335 -- $ IFS=.-: 00:09:20.067 06:18:48 -- scripts/common.sh@335 -- $ read -ra ver1 00:09:20.067 06:18:48 -- scripts/common.sh@336 -- $ IFS=.-: 00:09:20.067 06:18:48 -- scripts/common.sh@336 -- $ read -ra ver2 00:09:20.067 06:18:48 -- scripts/common.sh@337 -- $ local 'op=<' 00:09:20.067 06:18:48 -- scripts/common.sh@339 -- $ ver1_l=2 00:09:20.067 06:18:48 -- scripts/common.sh@340 -- $ ver2_l=1 00:09:20.067 06:18:48 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:09:20.067 06:18:48 -- scripts/common.sh@343 -- $ case "$op" in 00:09:20.067 06:18:48 -- scripts/common.sh@344 -- $ : 1 00:09:20.067 06:18:48 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:09:20.067 06:18:48 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:09:20.067 06:18:48 -- scripts/common.sh@364 -- $ decimal 1 00:09:20.067 06:18:48 -- scripts/common.sh@352 -- $ local d=1 00:09:20.067 06:18:48 -- scripts/common.sh@353 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:09:20.067 06:18:48 -- scripts/common.sh@354 -- $ echo 1 00:09:20.067 06:18:48 -- scripts/common.sh@364 -- $ ver1[v]=1 00:09:20.067 06:18:48 -- scripts/common.sh@365 -- $ decimal 2 00:09:20.067 06:18:48 -- scripts/common.sh@352 -- $ local d=2 00:09:20.067 06:18:48 -- scripts/common.sh@353 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:09:20.067 06:18:48 -- scripts/common.sh@354 -- $ echo 2 00:09:20.067 06:18:48 -- scripts/common.sh@365 -- $ ver2[v]=2 00:09:20.067 06:18:48 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:09:20.067 06:18:48 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:09:20.067 06:18:48 -- scripts/common.sh@367 -- $ return 0 00:09:20.067 06:18:48 -- common/autotest_common.sh@1691 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:20.067 06:18:48 -- common/autotest_common.sh@1703 -- $ export 'LCOV_OPTS= 00:09:20.067 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:20.067 --rc genhtml_branch_coverage=1 00:09:20.067 --rc genhtml_function_coverage=1 00:09:20.067 --rc genhtml_legend=1 00:09:20.067 --rc geninfo_all_blocks=1 00:09:20.067 --rc geninfo_unexecuted_blocks=1 00:09:20.067 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:20.067 ' 00:09:20.067 06:18:48 -- common/autotest_common.sh@1703 -- $ LCOV_OPTS=' 00:09:20.067 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:20.067 --rc genhtml_branch_coverage=1 00:09:20.067 --rc genhtml_function_coverage=1 00:09:20.067 --rc genhtml_legend=1 00:09:20.067 --rc geninfo_all_blocks=1 00:09:20.068 --rc geninfo_unexecuted_blocks=1 00:09:20.068 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:20.068 ' 00:09:20.068 
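[Editor's note] For reference, the cmp_versions/decimal xtrace above is SPDK's shell-side version comparison; here it decides that lcov 1.15 is older than 2, after which the log shows the lcov_-prefixed --rc options being exported. Below is a condensed Bash reconstruction of the logic as it appears in the trace (a simplified sketch, not the verbatim scripts/common.sh source), assuming purely numeric dot-separated version components:

    lt() { cmp_versions "$1" '<' "$2"; }      # traced invocation: lt 1.15 2

    cmp_versions() {
        local -a ver1 ver2
        local op=$2 v d1 d2
        IFS=.-: read -ra ver1 <<< "$1"        # "1.15" -> (1 15)
        IFS=.-: read -ra ver2 <<< "$3"        # "2"    -> (2)
        local len=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < len; v++ )); do
            d1=${ver1[v]:-0} d2=${ver2[v]:-0}  # missing components count as 0
            (( d1 > d2 )) && { [[ $op == '>' || $op == '>=' ]]; return; }
            (( d1 < d2 )) && { [[ $op == '<' || $op == '<=' ]]; return; }
        done
        [[ $op == '==' || $op == '<=' || $op == '>=' ]]  # all components equal
    }

In the trace this returns 0 at the first component (1 < 2), matching the "return 0" step logged above.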
06:18:48 -- common/autotest_common.sh@1704 -- $ export 'LCOV=lcov 00:09:20.068 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:20.068 --rc genhtml_branch_coverage=1 00:09:20.068 --rc genhtml_function_coverage=1 00:09:20.068 --rc genhtml_legend=1 00:09:20.068 --rc geninfo_all_blocks=1 00:09:20.068 --rc geninfo_unexecuted_blocks=1 00:09:20.068 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:20.068 ' 00:09:20.068 06:18:48 -- common/autotest_common.sh@1704 -- $ LCOV='lcov 00:09:20.068 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:20.068 --rc genhtml_branch_coverage=1 00:09:20.068 --rc genhtml_function_coverage=1 00:09:20.068 --rc genhtml_legend=1 00:09:20.068 --rc geninfo_all_blocks=1 00:09:20.068 --rc geninfo_unexecuted_blocks=1 00:09:20.068 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:20.068 ' 00:09:20.068 06:18:48 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:20.068 06:18:48 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:09:20.068 06:18:48 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:20.068 06:18:48 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:20.068 06:18:48 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:20.068 06:18:48 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:20.068 06:18:48 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:20.068 06:18:48 -- paths/export.sh@5 -- $ export PATH 00:09:20.068 06:18:48 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:20.068 06:18:48 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:09:20.068 06:18:48 -- common/autobuild_common.sh@440 -- $ date +%s 00:09:20.068 06:18:48 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1732684728.XXXXXX 00:09:20.068 06:18:48 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1732684728.EGjRfM 00:09:20.068 06:18:48 -- common/autobuild_common.sh@442 -- 
$ [[ -n '' ]] 00:09:20.068 06:18:48 -- common/autobuild_common.sh@446 -- $ '[' -n '' ']' 00:09:20.068 06:18:48 -- common/autobuild_common.sh@449 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/' 00:09:20.068 06:18:48 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:09:20.068 06:18:48 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:09:20.068 06:18:48 -- common/autobuild_common.sh@456 -- $ get_config_params 00:09:20.068 06:18:48 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:09:20.068 06:18:48 -- common/autotest_common.sh@10 -- $ set +x 00:09:20.068 06:18:48 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user' 00:09:20.068 06:18:48 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:09:20.068 06:18:48 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:20.068 06:18:48 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:09:20.068 06:18:48 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:09:20.068 06:18:48 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:09:20.068 06:18:48 -- spdk/autopackage.sh@19 -- $ timing_finish 00:09:20.068 06:18:48 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:09:20.068 06:18:48 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:09:20.068 06:18:48 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:20.068 06:18:48 -- spdk/autopackage.sh@20 -- $ exit 0 00:09:20.068 + [[ -n 4104895 ]] 00:09:20.068 + sudo kill 4104895 00:09:20.077 [Pipeline] } 00:09:20.091 [Pipeline] // stage 00:09:20.096 [Pipeline] } 00:09:20.109 [Pipeline] // timeout 00:09:20.114 [Pipeline] } 00:09:20.128 [Pipeline] // catchError 00:09:20.133 [Pipeline] } 00:09:20.150 [Pipeline] // wrap 00:09:20.156 [Pipeline] } 00:09:20.170 [Pipeline] // catchError 00:09:20.180 [Pipeline] stage 00:09:20.183 [Pipeline] { (Epilogue) 00:09:20.194 [Pipeline] catchError 00:09:20.196 [Pipeline] { 00:09:20.207 [Pipeline] echo 00:09:20.208 Cleanup processes 00:09:20.213 [Pipeline] sh 00:09:20.498 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:20.498 53072 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:20.512 [Pipeline] sh 00:09:20.798 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:20.799 ++ grep -v 'sudo pgrep' 00:09:20.799 ++ awk '{print $1}' 00:09:20.799 + sudo kill -9 00:09:20.799 + true 00:09:20.810 [Pipeline] sh 00:09:21.092 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:09:21.092 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:21.092 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:22.468 xz: Reduced the number of threads from 112 to 89 to not exceed the 
memory usage limit of 14,718 MiB 00:09:32.573 [Pipeline] sh 00:09:32.856 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:09:32.856 Artifacts sizes are good 00:09:32.871 [Pipeline] archiveArtifacts 00:09:32.878 Archiving artifacts 00:09:33.014 [Pipeline] sh 00:09:33.296 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:09:33.311 [Pipeline] cleanWs 00:09:33.321 [WS-CLEANUP] Deleting project workspace... 00:09:33.321 [WS-CLEANUP] Deferred wipeout is used... 00:09:33.328 [WS-CLEANUP] done 00:09:33.330 [Pipeline] } 00:09:33.347 [Pipeline] // catchError 00:09:33.360 [Pipeline] sh 00:09:33.643 + logger -p user.info -t JENKINS-CI 00:09:33.654 [Pipeline] } 00:09:33.668 [Pipeline] // stage 00:09:33.673 [Pipeline] } 00:09:33.688 [Pipeline] // node 00:09:33.694 [Pipeline] End of Pipeline 00:09:33.731 Finished: SUCCESS
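[Editor's note] For reference, the stray-process cleanup traced at the start of this Epilogue ("+ sudo pgrep -af ..." followed by "+ sudo kill -9" and "+ true") condenses to the idiom sketched below; the workspace path is taken from the log itself:

    # Collect PIDs of anything still running out of the workspace,
    # excluding the pgrep command itself, and kill them. When the PID
    # list is empty, 'kill -9' with no arguments fails, so '|| true'
    # keeps that case from failing the pipeline stage.
    ws=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    sudo kill -9 $(sudo pgrep -af "$ws" | grep -v 'sudo pgrep' | awk '{print $1}') || true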